Jan 20 11:04:06 crc systemd[1]: Starting Kubernetes Kubelet... Jan 20 11:04:06 crc restorecon[4686]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 20 11:04:06 
crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 
11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc 
restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 
crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 
crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 
11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 11:04:06 crc 
restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 
11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:06 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 
11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc 
restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 11:04:07 crc restorecon[4686]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 11:04:07 crc restorecon[4686]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 20 11:04:07 crc kubenswrapper[4961]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 20 11:04:07 crc kubenswrapper[4961]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 20 11:04:07 crc kubenswrapper[4961]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 20 11:04:07 crc kubenswrapper[4961]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
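The restorecon[4686] pass that ends just above the kubenswrapper startup messages walks /var/lib/kubelet and reports every path whose SELinux label it left in place: "not reset as customized by admin" is what restorecon prints when a file already carries a customizable type (here container_file_t with per-pod MCS categories such as s0:c7,c13), which it will not change unless forced. A minimal sketch of how one might inspect this, assuming a RHEL-family host with the targeted policy and the standard SELinux tools; the commands and the forced relabel are illustrative, not something this log calls for:

    # Types restorecon treats as admin-customized (container_file_t is expected here).
    cat /etc/selinux/targeted/contexts/customizable_types
    # Compare the current label of one of the paths above with the policy default.
    ls -Z /var/lib/kubelet/plugins/csi-hostpath/csi.sock
    matchpathcon /var/lib/kubelet/plugins/csi-hostpath/csi.sock
    # Dry run: report what a forced relabel would change, without touching anything.
    restorecon -RnvF /var/lib/kubelet
    # Force the reset, including customizable types (rarely appropriate on a running
    # node, since the container runtime re-applies per-pod MCS labels itself).
    restorecon -RvF /var/lib/kubelet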
Jan 20 11:04:07 crc kubenswrapper[4961]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 20 11:04:07 crc kubenswrapper[4961]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.338783 4961 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341425 4961 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341442 4961 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341446 4961 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341451 4961 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341454 4961 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341459 4961 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341465 4961 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341469 4961 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341472 4961 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341477 4961 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341481 4961 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341485 4961 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341488 4961 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341492 4961 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341495 4961 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341501 4961 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
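The "Flag ... has been deprecated" messages above all point at the same remedy: set the option in the file passed to the kubelet via --config, i.e. a KubeletConfiguration object. A minimal sketch of the equivalent config-file fields, assuming kubelet.config.k8s.io/v1beta1; the path and the values are placeholders for illustration, not this node's actual settings:

    # Write an illustrative KubeletConfiguration fragment (placeholder path and values).
    cat <<'EOF' > /tmp/kubelet-config-example.yaml
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock      # instead of --container-runtime-endpoint
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # instead of --volume-plugin-dir
    registerWithTaints:                                           # instead of --register-with-taints
    - key: node-role.kubernetes.io/master
      effect: NoSchedule
    systemReserved:                                               # instead of --system-reserved
      cpu: 500m
      memory: 1Gi
    evictionHard:                  # replaces --minimum-container-ttl-duration per the warning above
      memory.available: 100Mi
    EOF
    # Per the server.go message above, the sandbox ("pause") image formerly given by
    # --pod-infra-container-image should also be configured on the CRI runtime side.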
Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341506 4961 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341510 4961 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341514 4961 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341518 4961 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341522 4961 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341526 4961 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341529 4961 feature_gate.go:330] unrecognized feature gate: Example Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341533 4961 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341537 4961 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341540 4961 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341544 4961 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341547 4961 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341551 4961 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341554 4961 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341558 4961 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341561 4961 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341564 4961 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341738 4961 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341742 4961 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341745 4961 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341750 4961 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
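The long run of W-level feature_gate.go:330 entries comes from OpenShift-level gate names (GatewayAPI, PinnedImages, ClusterAPIInstall, and so on) being handed to the kubelet's upstream gate parser, which only recognizes upstream Kubernetes gates: unknown names are warned about and skipped, while known GA or deprecated gates produce the "Setting ... It will be removed in a future release" messages. The sketch below mirrors that skip-and-warn behaviour for illustration only; it is not the kubelet's actual implementation, and the gate names and defaults are a small subset taken from this log.

# Hedged sketch of the skip-and-warn behaviour these entries suggest:
# unrecognized gate names are logged and ignored, recognized ones applied.
import logging

KNOWN_GATES = {  # subset of gates the kubelet acknowledges in this log
    "DisableKubeletCloudCredentialProviders": True,  # GA
    "ValidatingAdmissionPolicy": True,               # GA
    "CloudDualStackNodeIPs": True,                   # GA
    "KMSv1": False,                                  # deprecated
}

def apply_gates(requested):
    effective = dict(KNOWN_GATES)
    for name, value in requested.items():
        if name not in KNOWN_GATES:
            logging.warning("unrecognized feature gate: %s", name)
            continue
        effective[name] = value
    return effective

# OpenShift-level names trigger warnings; KMSv1 is known and gets flipped on.
print(apply_gates({"GatewayAPI": True, "PinnedImages": True, "KMSv1": True}))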
Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341754 4961 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341758 4961 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341763 4961 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341766 4961 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341770 4961 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341774 4961 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341778 4961 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341782 4961 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341786 4961 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341791 4961 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341794 4961 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341798 4961 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341802 4961 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341805 4961 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341809 4961 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341813 4961 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341816 4961 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341820 4961 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341823 4961 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341828 4961 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341831 4961 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341835 4961 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341838 4961 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341842 4961 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341845 4961 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341849 4961 feature_gate.go:330] unrecognized 
feature gate: GCPLabelsTags Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341853 4961 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341856 4961 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341861 4961 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341866 4961 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341870 4961 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341873 4961 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341877 4961 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.341882 4961 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342328 4961 flags.go:64] FLAG: --address="0.0.0.0" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342346 4961 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342353 4961 flags.go:64] FLAG: --anonymous-auth="true" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342361 4961 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342366 4961 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342372 4961 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342378 4961 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342384 4961 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342388 4961 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342393 4961 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342397 4961 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342402 4961 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342406 4961 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342410 4961 flags.go:64] FLAG: --cgroup-root="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342414 4961 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342418 4961 flags.go:64] FLAG: --client-ca-file="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342422 4961 flags.go:64] FLAG: --cloud-config="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342426 4961 flags.go:64] FLAG: --cloud-provider="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342430 4961 flags.go:64] FLAG: --cluster-dns="[]" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342437 4961 
flags.go:64] FLAG: --cluster-domain="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342441 4961 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342446 4961 flags.go:64] FLAG: --config-dir="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342450 4961 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342455 4961 flags.go:64] FLAG: --container-log-max-files="5" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342461 4961 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342465 4961 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342470 4961 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342474 4961 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342478 4961 flags.go:64] FLAG: --contention-profiling="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342482 4961 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342487 4961 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342491 4961 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342495 4961 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342501 4961 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342506 4961 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342510 4961 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342515 4961 flags.go:64] FLAG: --enable-load-reader="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342519 4961 flags.go:64] FLAG: --enable-server="true" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342524 4961 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342529 4961 flags.go:64] FLAG: --event-burst="100" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342533 4961 flags.go:64] FLAG: --event-qps="50" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342537 4961 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342541 4961 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342546 4961 flags.go:64] FLAG: --eviction-hard="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342552 4961 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342556 4961 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342560 4961 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342564 4961 flags.go:64] FLAG: --eviction-soft="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342568 4961 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 20 11:04:07 crc kubenswrapper[4961]: 
I0120 11:04:07.342572 4961 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342576 4961 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342580 4961 flags.go:64] FLAG: --experimental-mounter-path="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342584 4961 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342588 4961 flags.go:64] FLAG: --fail-swap-on="true" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342592 4961 flags.go:64] FLAG: --feature-gates="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342598 4961 flags.go:64] FLAG: --file-check-frequency="20s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342602 4961 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342607 4961 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342611 4961 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342615 4961 flags.go:64] FLAG: --healthz-port="10248" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342620 4961 flags.go:64] FLAG: --help="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342624 4961 flags.go:64] FLAG: --hostname-override="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342628 4961 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342632 4961 flags.go:64] FLAG: --http-check-frequency="20s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342636 4961 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342640 4961 flags.go:64] FLAG: --image-credential-provider-config="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342644 4961 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342651 4961 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342655 4961 flags.go:64] FLAG: --image-service-endpoint="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342664 4961 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342668 4961 flags.go:64] FLAG: --kube-api-burst="100" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342672 4961 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342676 4961 flags.go:64] FLAG: --kube-api-qps="50" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342680 4961 flags.go:64] FLAG: --kube-reserved="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342684 4961 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342688 4961 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342693 4961 flags.go:64] FLAG: --kubelet-cgroups="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342697 4961 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342701 4961 flags.go:64] FLAG: --lock-file="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342705 4961 
flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342709 4961 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342713 4961 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342719 4961 flags.go:64] FLAG: --log-json-split-stream="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342723 4961 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342727 4961 flags.go:64] FLAG: --log-text-split-stream="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342731 4961 flags.go:64] FLAG: --logging-format="text" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342735 4961 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342740 4961 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342744 4961 flags.go:64] FLAG: --manifest-url="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342748 4961 flags.go:64] FLAG: --manifest-url-header="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342754 4961 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342758 4961 flags.go:64] FLAG: --max-open-files="1000000" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342764 4961 flags.go:64] FLAG: --max-pods="110" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342769 4961 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342773 4961 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342777 4961 flags.go:64] FLAG: --memory-manager-policy="None" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342781 4961 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342785 4961 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342789 4961 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342794 4961 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342805 4961 flags.go:64] FLAG: --node-status-max-images="50" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342817 4961 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342822 4961 flags.go:64] FLAG: --oom-score-adj="-999" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342826 4961 flags.go:64] FLAG: --pod-cidr="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342830 4961 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342837 4961 flags.go:64] FLAG: --pod-manifest-path="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342841 4961 flags.go:64] FLAG: --pod-max-pids="-1" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342845 4961 flags.go:64] FLAG: --pods-per-core="0" Jan 20 11:04:07 crc 
kubenswrapper[4961]: I0120 11:04:07.342849 4961 flags.go:64] FLAG: --port="10250" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342876 4961 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342882 4961 flags.go:64] FLAG: --provider-id="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342886 4961 flags.go:64] FLAG: --qos-reserved="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342891 4961 flags.go:64] FLAG: --read-only-port="10255" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342895 4961 flags.go:64] FLAG: --register-node="true" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342900 4961 flags.go:64] FLAG: --register-schedulable="true" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342903 4961 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342914 4961 flags.go:64] FLAG: --registry-burst="10" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342919 4961 flags.go:64] FLAG: --registry-qps="5" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342923 4961 flags.go:64] FLAG: --reserved-cpus="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342927 4961 flags.go:64] FLAG: --reserved-memory="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342933 4961 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342938 4961 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342943 4961 flags.go:64] FLAG: --rotate-certificates="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342947 4961 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342951 4961 flags.go:64] FLAG: --runonce="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342955 4961 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342959 4961 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342964 4961 flags.go:64] FLAG: --seccomp-default="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342968 4961 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342973 4961 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342977 4961 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342981 4961 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342987 4961 flags.go:64] FLAG: --storage-driver-password="root" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342993 4961 flags.go:64] FLAG: --storage-driver-secure="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342997 4961 flags.go:64] FLAG: --storage-driver-table="stats" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343001 4961 flags.go:64] FLAG: --storage-driver-user="root" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343006 4961 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343010 4961 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343015 
4961 flags.go:64] FLAG: --system-cgroups="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343019 4961 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343025 4961 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343029 4961 flags.go:64] FLAG: --tls-cert-file="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343033 4961 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343039 4961 flags.go:64] FLAG: --tls-min-version="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343043 4961 flags.go:64] FLAG: --tls-private-key-file="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343048 4961 flags.go:64] FLAG: --topology-manager-policy="none" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343052 4961 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343056 4961 flags.go:64] FLAG: --topology-manager-scope="container" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343080 4961 flags.go:64] FLAG: --v="2" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343086 4961 flags.go:64] FLAG: --version="false" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343091 4961 flags.go:64] FLAG: --vmodule="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343097 4961 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343101 4961 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343222 4961 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343229 4961 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343234 4961 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343239 4961 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343244 4961 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343248 4961 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343253 4961 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343257 4961 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343260 4961 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343264 4961 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343268 4961 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343272 4961 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343283 4961 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343287 4961 
feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343291 4961 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343294 4961 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343298 4961 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343303 4961 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343308 4961 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343311 4961 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343316 4961 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343319 4961 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343323 4961 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343327 4961 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343330 4961 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343334 4961 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343337 4961 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343341 4961 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343344 4961 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343348 4961 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343351 4961 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343355 4961 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343360 4961 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343364 4961 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343369 4961 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
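The I-level flags.go:64 entries above dump every kubelet flag together with its effective value, which is useful when checking what configuration a restart actually picked up. A small sketch for pulling that dump out of journal output into a dict; the regex and the sample line assume journal lines shaped exactly like the ones in this log.

# Hedged sketch: extract the 'FLAG: --name="value"' dump above into a dict.
import re

FLAG_RE = re.compile(r'FLAG: --([\w-]+)="?(.*?)"?$')

def parse_flags(journal_lines):
    flags = {}
    for line in journal_lines:
        m = FLAG_RE.search(line)
        if m:
            flags[m.group(1)] = m.group(2)
    return flags

sample = [
    'Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.342470 4961 '
    'flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"',
]
print(parse_flags(sample))  # {'containerd': '/run/containerd/containerd.sock'}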
Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343373 4961 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343377 4961 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343381 4961 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343384 4961 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343387 4961 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343391 4961 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343396 4961 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343401 4961 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343405 4961 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343411 4961 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343415 4961 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343418 4961 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343422 4961 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343427 4961 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343431 4961 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343435 4961 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343439 4961 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343443 4961 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343447 4961 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343451 4961 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343455 4961 feature_gate.go:330] unrecognized feature gate: Example Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343458 4961 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343462 4961 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343466 4961 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343469 4961 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 20 11:04:07 crc kubenswrapper[4961]: 
W0120 11:04:07.343473 4961 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343476 4961 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343480 4961 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343484 4961 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343487 4961 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343490 4961 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343494 4961 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343497 4961 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343501 4961 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343505 4961 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.343509 4961 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.343521 4961 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.359151 4961 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.359220 4961 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359428 4961 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359448 4961 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359458 4961 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359469 4961 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359478 4961 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359489 4961 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
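The I-level feature_gate.go:386 line above prints the effective gate set in Go's map syntax; converting it into a structured form makes it easy to diff the gate set between restarts or against another node. A sketch under that assumption, with the input string copied (abridged) from this log:

# Hedged sketch: parse the Go-formatted "feature gates: {map[Name:bool ...]}"
# line into a Python dict for comparison across restarts.
import re

def parse_gate_map(line):
    inner = re.search(r"map\[(.*?)\]", line).group(1)
    gates = {}
    for pair in inner.split():
        name, _, value = pair.partition(":")
        gates[name] = (value == "true")
    return gates

line = ("feature gates: {map[CloudDualStackNodeIPs:true "
        "DisableKubeletCloudCredentialProviders:true KMSv1:true "
        "NodeSwap:false ValidatingAdmissionPolicy:true]}")
print(parse_gate_map(line))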
Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359504 4961 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359514 4961 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359524 4961 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359534 4961 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359542 4961 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359550 4961 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359559 4961 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359567 4961 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359575 4961 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359583 4961 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359591 4961 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359598 4961 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359606 4961 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359617 4961 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359627 4961 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359635 4961 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359644 4961 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359653 4961 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359661 4961 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359670 4961 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359678 4961 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359686 4961 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359693 4961 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359702 4961 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359709 4961 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359720 4961 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359736 4961 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359744 4961 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359753 4961 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359761 4961 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359770 4961 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359777 4961 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359785 4961 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359796 4961 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359806 4961 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359814 4961 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359822 4961 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359831 4961 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359839 4961 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359847 4961 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359855 4961 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359863 4961 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359871 4961 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359879 4961 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359887 4961 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359895 4961 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359904 4961 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359913 4961 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359922 4961 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359931 4961 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359940 4961 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359950 4961 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359958 4961 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359967 4961 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359975 4961 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359983 4961 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.359991 4961 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360000 4961 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360008 4961 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360016 4961 feature_gate.go:330] unrecognized feature gate: 
Example Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360024 4961 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360031 4961 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360042 4961 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360052 4961 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360059 4961 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.360101 4961 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360356 4961 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360374 4961 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360384 4961 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360393 4961 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360401 4961 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360410 4961 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360420 4961 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360428 4961 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360436 4961 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360444 4961 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360452 4961 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360460 4961 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360468 4961 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360476 4961 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360484 4961 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360492 4961 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360500 4961 feature_gate.go:330] 
unrecognized feature gate: EtcdBackendQuota Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360507 4961 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360516 4961 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360527 4961 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360539 4961 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360549 4961 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360557 4961 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360569 4961 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360580 4961 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360589 4961 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360598 4961 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360606 4961 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360615 4961 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360623 4961 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360631 4961 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360639 4961 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360648 4961 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360656 4961 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360665 4961 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360673 4961 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360684 4961 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360693 4961 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360702 4961 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360710 4961 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360718 4961 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360727 4961 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360735 4961 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360743 4961 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360751 4961 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360761 4961 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360771 4961 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360779 4961 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360787 4961 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360795 4961 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360803 4961 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360811 4961 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360820 4961 feature_gate.go:330] unrecognized feature gate: Example Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360828 4961 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360836 4961 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360844 4961 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360853 4961 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360862 4961 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360870 4961 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360879 4961 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360887 4961 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360898 4961 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360909 4961 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360921 4961 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360943 4961 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360954 4961 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360965 4961 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360976 4961 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360987 4961 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.360996 4961 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.361006 4961 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.361023 4961 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.361398 4961 server.go:940] "Client rotation is on, will bootstrap in background" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.367107 4961 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.367246 4961 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
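The entries that follow show the client certificate manager loading the current kubelet client certificate, picking a rotation deadline well before expiry (2025-11-08 for a certificate that expires 2026-02-24), and then failing its first CSR POST because the API server at api-int.crc.testing:6443 is not reachable yet, so rotation will be retried later. The sketch below illustrates the deadline arithmetic under the assumption, not confirmed by this log, that upstream kubelet behaviour is to pick a random point at roughly 70-90% of the certificate's validity window; the notBefore date is illustrative since it is not printed here.

# Hedged sketch of the rotation-deadline arithmetic suggested by the next
# entries. Assumption (upstream behaviour, not confirmed by this log): the
# deadline falls at a random point roughly 70-90% through the cert lifetime.
import random
from datetime import datetime, timedelta

def rotation_deadline(not_before, not_after):
    lifetime = not_after - not_before
    fraction = 0.7 + 0.2 * random.random()  # assumption: jitter in [0.7, 0.9)
    return not_before + timedelta(seconds=lifetime.total_seconds() * fraction)

not_before = datetime(2025, 2, 24, 5, 52, 8)  # illustrative only, not logged
not_after = datetime(2026, 2, 24, 5, 52, 8)   # expiry as logged below
print(rotation_deadline(not_before, not_after))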
Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.368393 4961 server.go:997] "Starting client certificate rotation" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.368444 4961 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.369007 4961 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-08 02:28:46.33829101 +0000 UTC Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.369222 4961 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.376620 4961 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 20 11:04:07 crc kubenswrapper[4961]: E0120 11:04:07.378702 4961 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.381401 4961 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.393518 4961 log.go:25] "Validated CRI v1 runtime API" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.424106 4961 log.go:25] "Validated CRI v1 image API" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.426363 4961 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.429695 4961 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-20-11-00-00-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.429743 4961 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.456966 4961 manager.go:217] Machine: {Timestamp:2026-01-20 11:04:07.454716901 +0000 UTC m=+0.239216842 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:0a5b697e-e547-4549-beeb-184b8e2414c0 BootID:ed553be1-be24-4e3a-9c6e-30826c2337ef Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 
Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:63:6d:24 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:63:6d:24 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:01:ee:0b Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:ee:b3:44 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:b0:1e:6b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:d1:24:ed Speed:-1 Mtu:1496} {Name:eth10 MacAddress:0a:c8:65:b9:16:45 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:32:4a:c9:da:46:e6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] 
Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.457413 4961 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.457613 4961 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.458142 4961 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.458735 4961 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.458797 4961 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.459231 4961 topology_manager.go:138] "Creating topology manager with none policy" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.459250 4961 
container_manager_linux.go:303] "Creating device plugin manager" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.459603 4961 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.459658 4961 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.460098 4961 state_mem.go:36] "Initialized new in-memory state store" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.460628 4961 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.461718 4961 kubelet.go:418] "Attempting to sync node with API server" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.461751 4961 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.461793 4961 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.461817 4961 kubelet.go:324] "Adding apiserver pod source" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.461838 4961 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.464050 4961 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.464796 4961 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.465192 4961 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.465263 4961 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Jan 20 11:04:07 crc kubenswrapper[4961]: E0120 11:04:07.465314 4961 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Jan 20 11:04:07 crc kubenswrapper[4961]: E0120 11:04:07.465337 4961 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.466175 4961 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.467186 4961 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 20 11:04:07 crc 
kubenswrapper[4961]: I0120 11:04:07.467239 4961 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.467254 4961 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.467270 4961 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.467294 4961 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.467308 4961 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.467324 4961 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.467348 4961 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.467367 4961 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.467382 4961 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.467423 4961 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.467437 4961 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.467761 4961 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.468254 4961 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.468612 4961 server.go:1280] "Started kubelet" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.469236 4961 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.469385 4961 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 20 11:04:07 crc systemd[1]: Started Kubernetes Kubelet. 
Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.471244 4961 server.go:460] "Adding debug handlers to kubelet server" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.471344 4961 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.472945 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.473007 4961 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.473153 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 23:23:41.331190169 +0000 UTC Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.474471 4961 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.474505 4961 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.474660 4961 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.475110 4961 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Jan 20 11:04:07 crc kubenswrapper[4961]: E0120 11:04:07.475171 4961 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Jan 20 11:04:07 crc kubenswrapper[4961]: E0120 11:04:07.475197 4961 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.475874 4961 factory.go:55] Registering systemd factory Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.476052 4961 factory.go:221] Registration of the systemd container factory successfully Jan 20 11:04:07 crc kubenswrapper[4961]: E0120 11:04:07.474246 4961 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.241:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188c6b966d0c7311 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-20 11:04:07.468561169 +0000 UTC m=+0.253061080,LastTimestamp:2026-01-20 11:04:07.468561169 +0000 UTC m=+0.253061080,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.476696 4961 factory.go:153] Registering CRI-O factory Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.476731 4961 factory.go:221] Registration of the crio container factory successfully Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 
11:04:07.477383 4961 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.477428 4961 factory.go:103] Registering Raw factory Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.477457 4961 manager.go:1196] Started watching for new ooms in manager Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.478589 4961 manager.go:319] Starting recovery of all containers Jan 20 11:04:07 crc kubenswrapper[4961]: E0120 11:04:07.480715 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="200ms" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.497635 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.497703 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.497727 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.497748 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.497766 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.497784 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.497802 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.497822 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 
11:04:07.497842 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.497862 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.497881 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.497898 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.497915 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.497936 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.497954 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.497969 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.497981 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.497994 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498008 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 
11:04:07.498025 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498041 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498082 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498150 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498168 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498184 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498199 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498219 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498236 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498255 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498273 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498328 4961 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498348 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498368 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498387 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498407 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498426 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498443 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498460 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498476 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498492 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498509 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498550 4961 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498570 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498586 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498604 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498621 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498641 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498657 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498676 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498694 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498716 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498734 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498761 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498803 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498825 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498845 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498863 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498883 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498902 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498920 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498937 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498962 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498983 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.498999 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499019 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499037 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499055 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499140 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499160 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499177 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499199 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499217 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499233 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499251 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499267 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499284 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499301 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499317 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499337 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499354 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499371 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499388 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499408 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499425 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499443 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499461 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499477 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499493 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499510 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499527 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499544 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499561 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499577 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499597 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499614 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499631 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499650 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499667 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499687 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499702 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499720 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499739 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499755 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499771 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499801 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499822 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499841 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499860 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499879 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499900 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499917 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499936 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499954 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499970 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.499988 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.500005 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.501908 4961 manager.go:324] Recovery completed Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.502424 4961 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503043 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 20 11:04:07 crc 
kubenswrapper[4961]: I0120 11:04:07.503341 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503362 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503380 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503396 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503410 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503426 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503443 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503462 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503477 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503493 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503508 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 20 11:04:07 crc 
kubenswrapper[4961]: I0120 11:04:07.503521 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503539 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503554 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503570 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503588 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503603 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503618 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503633 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503648 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503668 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503684 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503703 4961 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503719 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503735 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503751 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503767 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503784 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503802 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503817 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503834 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503849 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503940 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503962 4961 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503981 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.503997 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504014 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504030 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504046 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504094 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504113 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504128 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504143 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504159 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504192 4961 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504210 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504225 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504241 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504259 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504276 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504291 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504311 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504328 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504344 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504357 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504375 4961 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504391 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504407 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504427 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504443 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504462 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504478 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504493 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504508 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504524 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504540 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504556 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504571 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504589 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504606 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504622 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504640 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504654 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504670 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504686 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504700 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504720 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504736 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504753 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504768 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504784 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504801 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504816 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504876 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504892 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504909 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504925 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504943 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504957 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504973 4961 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504987 4961 reconstruct.go:97] "Volume reconstruction finished" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.504998 4961 reconciler.go:26] "Reconciler: start to sync state" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.514849 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.516462 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.516508 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.516518 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.517415 4961 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.517431 4961 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.517453 4961 state_mem.go:36] "Initialized new in-memory state store" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.526983 4961 policy_none.go:49] "None policy: Start" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.529899 4961 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.529971 4961 state_mem.go:35] "Initializing new in-memory state store" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.535639 4961 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.537607 4961 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.537664 4961 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.537700 4961 kubelet.go:2335] "Starting kubelet main sync loop" Jan 20 11:04:07 crc kubenswrapper[4961]: E0120 11:04:07.537764 4961 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 20 11:04:07 crc kubenswrapper[4961]: W0120 11:04:07.538628 4961 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Jan 20 11:04:07 crc kubenswrapper[4961]: E0120 11:04:07.538722 4961 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Jan 20 11:04:07 crc kubenswrapper[4961]: E0120 11:04:07.577211 4961 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.581288 4961 manager.go:334] "Starting Device Plugin manager" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.581659 4961 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.581682 4961 server.go:79] "Starting device plugin registration server" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.582290 4961 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.582333 4961 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.582585 4961 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.582709 4961 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.582729 4961 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 20 11:04:07 crc kubenswrapper[4961]: E0120 11:04:07.589923 4961 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.638846 4961 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.639120 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.640465 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.640511 4961 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.640525 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.640706 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.640971 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.641037 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.642142 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.642175 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.642186 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.642251 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.642273 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.642286 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.642344 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.642823 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.642940 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.643086 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.643128 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.643145 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.643251 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.643416 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.643461 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.645894 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.646030 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.646140 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.645948 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.646387 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.646401 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.645987 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.646500 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.646512 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.646700 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.646841 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.646884 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.648039 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.648080 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.648090 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.648257 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.648274 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.648283 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.648412 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.648438 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.650179 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.650206 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.650216 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:07 crc kubenswrapper[4961]: E0120 11:04:07.682458 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="400ms" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.682592 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.684616 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.684757 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.684833 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.684916 4961 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 11:04:07 crc kubenswrapper[4961]: E0120 11:04:07.685853 4961 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.241:6443: connect: connection refused" node="crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.708576 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.708721 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.708748 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.708820 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.708842 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.708904 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.708928 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.709096 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.709126 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.709221 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.709252 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.709280 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.709376 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.709430 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.709470 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.810987 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.811116 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.811142 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.811165 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.811186 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.811202 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.811219 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.811235 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.811254 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.811303 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.811321 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.811380 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.811397 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.811413 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.811430 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.812080 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.812148 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.812176 4961 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.812208 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.812232 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.812284 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.812305 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.812327 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.812358 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.812381 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.812409 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.812435 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 
11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.812456 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.812477 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.812089 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.886642 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.888242 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.888313 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.888325 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.888365 4961 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 11:04:07 crc kubenswrapper[4961]: E0120 11:04:07.889033 4961 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.241:6443: connect: connection refused" node="crc" Jan 20 11:04:07 crc kubenswrapper[4961]: I0120 11:04:07.999895 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.019656 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.027160 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:04:08 crc kubenswrapper[4961]: W0120 11:04:08.034099 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-60df986562179324700e226a58d6b618db4c29ac448f7c05edc9e1537d7bff36 WatchSource:0}: Error finding container 60df986562179324700e226a58d6b618db4c29ac448f7c05edc9e1537d7bff36: Status 404 returned error can't find the container with id 60df986562179324700e226a58d6b618db4c29ac448f7c05edc9e1537d7bff36 Jan 20 11:04:08 crc kubenswrapper[4961]: W0120 11:04:08.044750 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-40199f9cbd5fb6205e1ebae84e689645314f8d598fbfe4c8d7349ac0cc310ace WatchSource:0}: Error finding container 40199f9cbd5fb6205e1ebae84e689645314f8d598fbfe4c8d7349ac0cc310ace: Status 404 returned error can't find the container with id 40199f9cbd5fb6205e1ebae84e689645314f8d598fbfe4c8d7349ac0cc310ace Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.050171 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.054665 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 11:04:08 crc kubenswrapper[4961]: W0120 11:04:08.065891 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-296f4081b2bfebc3a2e3360c9abe9d3927fd3cea4fac593f446ecab24effbdd0 WatchSource:0}: Error finding container 296f4081b2bfebc3a2e3360c9abe9d3927fd3cea4fac593f446ecab24effbdd0: Status 404 returned error can't find the container with id 296f4081b2bfebc3a2e3360c9abe9d3927fd3cea4fac593f446ecab24effbdd0 Jan 20 11:04:08 crc kubenswrapper[4961]: E0120 11:04:08.084607 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="800ms" Jan 20 11:04:08 crc kubenswrapper[4961]: W0120 11:04:08.085687 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-b49ed06e510b7e4f0ac58ffbfa79b5eaa989e77956e6cd1c3ff8929349e40db4 WatchSource:0}: Error finding container b49ed06e510b7e4f0ac58ffbfa79b5eaa989e77956e6cd1c3ff8929349e40db4: Status 404 returned error can't find the container with id b49ed06e510b7e4f0ac58ffbfa79b5eaa989e77956e6cd1c3ff8929349e40db4 Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.290097 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.291888 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.291964 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.291980 4961 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.292022 4961 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 11:04:08 crc kubenswrapper[4961]: E0120 11:04:08.292720 4961 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.241:6443: connect: connection refused" node="crc" Jan 20 11:04:08 crc kubenswrapper[4961]: W0120 11:04:08.319910 4961 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Jan 20 11:04:08 crc kubenswrapper[4961]: E0120 11:04:08.319992 4961 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.469035 4961 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.473707 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 03:37:45.510444747 +0000 UTC Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.545505 4961 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="6fbb752b7dc6d3ee5c788110ca8dae6fa4b94b77dad4470d896366eae5ff4ae2" exitCode=0 Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.545577 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"6fbb752b7dc6d3ee5c788110ca8dae6fa4b94b77dad4470d896366eae5ff4ae2"} Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.545686 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b49ed06e510b7e4f0ac58ffbfa79b5eaa989e77956e6cd1c3ff8929349e40db4"} Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.545788 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.547079 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.547190 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.547207 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.548284 4961 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" 
containerID="8c5fe5c79777994a568ba00a032647fa1a79284dedfaf0bd481a7a8fa526c9d3" exitCode=0 Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.548367 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"8c5fe5c79777994a568ba00a032647fa1a79284dedfaf0bd481a7a8fa526c9d3"} Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.548430 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"296f4081b2bfebc3a2e3360c9abe9d3927fd3cea4fac593f446ecab24effbdd0"} Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.548794 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.549680 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.549702 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.549712 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.550925 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"315a88354b4b31aef2ff2afed632ace0acfd5b7bfd5f822722f82ad407c88bce"} Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.550958 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"55c35f8d52f0dfb5bf4d4882eaeb369636e9540ee5bbde0e76ba242098483429"} Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.553717 4961 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f" exitCode=0 Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.553770 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f"} Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.553829 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"40199f9cbd5fb6205e1ebae84e689645314f8d598fbfe4c8d7349ac0cc310ace"} Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.553958 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.554793 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.554831 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.554846 4961 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.555940 4961 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="71aaee0ec6cac44676c7e1ee39ed3c03bf2ef5f6c00ce2703cccf8ea96183492" exitCode=0 Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.555989 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"71aaee0ec6cac44676c7e1ee39ed3c03bf2ef5f6c00ce2703cccf8ea96183492"} Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.556022 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"60df986562179324700e226a58d6b618db4c29ac448f7c05edc9e1537d7bff36"} Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.556176 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.556456 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.557475 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.557517 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.557528 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.557545 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.557667 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:08 crc kubenswrapper[4961]: I0120 11:04:08.557689 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:08 crc kubenswrapper[4961]: W0120 11:04:08.568318 4961 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Jan 20 11:04:08 crc kubenswrapper[4961]: E0120 11:04:08.568420 4961 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Jan 20 11:04:08 crc kubenswrapper[4961]: W0120 11:04:08.756831 4961 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Jan 20 11:04:08 crc kubenswrapper[4961]: E0120 11:04:08.756924 4961 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Jan 20 11:04:08 crc kubenswrapper[4961]: E0120 11:04:08.886168 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="1.6s" Jan 20 11:04:08 crc kubenswrapper[4961]: W0120 11:04:08.935448 4961 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Jan 20 11:04:08 crc kubenswrapper[4961]: E0120 11:04:08.935737 4961 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.093821 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.095651 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.095695 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.095707 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.095733 4961 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 11:04:09 crc kubenswrapper[4961]: E0120 11:04:09.096383 4961 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.241:6443: connect: connection refused" node="crc" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.474175 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 09:46:20.216535325 +0000 UTC Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.474794 4961 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.241:6443: connect: connection refused Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.483137 4961 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 20 11:04:09 crc kubenswrapper[4961]: E0120 11:04:09.484242 4961 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.241:6443: connect: connection refused" logger="UnhandledError" Jan 20 11:04:09 
crc kubenswrapper[4961]: I0120 11:04:09.561374 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"69f3da959db1c47c9c452b95a85040e026923fe4b5675e1c1318477a010d2ada"} Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.561570 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.564471 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.564508 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.564519 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.566996 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c4f44db1582cc5c9f35ea90c5ce1bfe2c749dce5800d638d539fc39de5c304b4"} Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.567077 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"033a6d860911d52b901dfab29d3bb6132d573ed672c3798c0bcafaf61fcb4e6e"} Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.567097 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c0e2b9f31ba33e039a5e85695e14e6a2c3720c771b1f0c516681dcea8c9e9757"} Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.567198 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.568034 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.568087 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.568099 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.569394 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dc2e0add51616e41f19205bd7fe05c854eea3e9b340b62368d289e68acb747c3"} Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.569423 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"37d83d02d09fcdffbd9e9bf0d18356e6d81350723ee25a8d2361c8152c11e7ed"} Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.569433 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d5463a84baf08187661405eb842d38a1e1fa0375c3a844042e71bf6d9f452fc7"} Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.569492 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.569997 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.570028 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.570038 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.574095 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ef3a93f93008fcfb5b46f10e01ea85bd3bdc5a99e3683498ac34f4fe3466e2d0"} Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.574129 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"32cd6ed8ff62bd299b9410a413c0662071a02537a85fb2d15d1b1a8083abe404"} Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.574142 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"482ece6a2d496362ab7bf45d1d70e97dfb70711b4094a08c3acf411e2bb2bd5e"} Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.574156 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a5711126878eeb10633f1b78029436e832beb6b014e35e3ba021407ebed1422e"} Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.577120 4961 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="cfb935c7cbb9ab8609a5a4215afdd0ce5f5b5c092c85f435ec23ac4b7fcd4b64" exitCode=0 Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.577175 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"cfb935c7cbb9ab8609a5a4215afdd0ce5f5b5c092c85f435ec23ac4b7fcd4b64"} Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.577448 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.578584 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.578619 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:09 crc kubenswrapper[4961]: I0120 11:04:09.578632 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.474532 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 11:41:31.535127021 
+0000 UTC Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.588469 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5ebc519f6d0ad45a8fb85392da0ecb11cbf59c38b9cb59933bc6cc18426f7b5f"} Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.588605 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.589468 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.589505 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.589514 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.591351 4961 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="88a350de337578c247acb33c73b55817c6b27fdf76af58064667126c716f6f5b" exitCode=0 Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.591419 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"88a350de337578c247acb33c73b55817c6b27fdf76af58064667126c716f6f5b"} Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.591436 4961 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.591485 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.591531 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.591563 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.592294 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.592314 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.592330 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.592335 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.592341 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.592347 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.593380 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.593422 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.593444 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.697182 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.698328 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.698366 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.698376 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.698401 4961 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 11:04:10 crc kubenswrapper[4961]: I0120 11:04:10.738056 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.475550 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 21:51:18.843841397 +0000 UTC Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.554735 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.599878 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"134fa197a066b4a0b61829032c1067a2670286be2bf63e19475c6b30e2565292"} Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.599937 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2770ffc08b1c0b558627e2d48b6c0d4d83bcbbdafac889ad31d821ecdd877e96"} Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.599949 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"42c35a03274a2c2adade0809a7a381e0a3fccb69d5f755741dcff42de59dbb68"} Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.599959 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7f232ed0983c57b03df98b2740493675a1817614f20b9cf811a3260b8bc92559"} Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.599968 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d014c3ff4a5711686d08fc08ec7993bd5ac4d1b579c36262c8091d018b94d8ab"} Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.599967 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.599995 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.599967 4961 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.600103 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.601536 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.601578 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.601597 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.601625 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.601643 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.601651 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.601661 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.601671 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:11 crc kubenswrapper[4961]: I0120 11:04:11.601671 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:12 crc kubenswrapper[4961]: I0120 11:04:12.256658 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:04:12 crc kubenswrapper[4961]: I0120 11:04:12.256936 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:12 crc kubenswrapper[4961]: I0120 11:04:12.258573 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:12 crc kubenswrapper[4961]: I0120 11:04:12.258666 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:12 crc kubenswrapper[4961]: I0120 11:04:12.258688 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:12 crc kubenswrapper[4961]: I0120 11:04:12.475637 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 01:35:58.772604738 +0000 UTC Jan 20 11:04:12 crc kubenswrapper[4961]: I0120 11:04:12.602934 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:12 crc kubenswrapper[4961]: I0120 11:04:12.603893 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:12 crc kubenswrapper[4961]: I0120 11:04:12.603972 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:12 crc kubenswrapper[4961]: I0120 11:04:12.603994 4961 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.476579 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 04:10:09.357207987 +0000 UTC Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.553335 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.553582 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.555336 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.555398 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.555423 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.561805 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.605798 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.605943 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.607020 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.607121 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.607141 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.732491 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.732817 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.734827 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.734913 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.734934 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:13 crc kubenswrapper[4961]: I0120 11:04:13.877029 4961 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 20 11:04:14 crc kubenswrapper[4961]: I0120 11:04:14.477763 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 
14:32:50.091202766 +0000 UTC Jan 20 11:04:14 crc kubenswrapper[4961]: I0120 11:04:14.608850 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:14 crc kubenswrapper[4961]: I0120 11:04:14.610001 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:14 crc kubenswrapper[4961]: I0120 11:04:14.610036 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:14 crc kubenswrapper[4961]: I0120 11:04:14.610048 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:14 crc kubenswrapper[4961]: I0120 11:04:14.769127 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:04:14 crc kubenswrapper[4961]: I0120 11:04:14.816172 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 20 11:04:14 crc kubenswrapper[4961]: I0120 11:04:14.816412 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:14 crc kubenswrapper[4961]: I0120 11:04:14.817687 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:14 crc kubenswrapper[4961]: I0120 11:04:14.817726 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:14 crc kubenswrapper[4961]: I0120 11:04:14.817737 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:15 crc kubenswrapper[4961]: I0120 11:04:15.257655 4961 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 20 11:04:15 crc kubenswrapper[4961]: I0120 11:04:15.257776 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 11:04:15 crc kubenswrapper[4961]: I0120 11:04:15.478557 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 02:36:13.357024303 +0000 UTC Jan 20 11:04:15 crc kubenswrapper[4961]: I0120 11:04:15.612119 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:15 crc kubenswrapper[4961]: I0120 11:04:15.613511 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:15 crc kubenswrapper[4961]: I0120 11:04:15.613564 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:15 crc kubenswrapper[4961]: I0120 11:04:15.613578 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:16 crc kubenswrapper[4961]: I0120 11:04:16.479724 
4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 02:55:04.311825751 +0000 UTC Jan 20 11:04:17 crc kubenswrapper[4961]: I0120 11:04:17.480510 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 03:55:41.280230456 +0000 UTC Jan 20 11:04:17 crc kubenswrapper[4961]: E0120 11:04:17.590331 4961 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 20 11:04:18 crc kubenswrapper[4961]: I0120 11:04:18.481495 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 09:35:39.12513249 +0000 UTC Jan 20 11:04:18 crc kubenswrapper[4961]: I0120 11:04:18.537388 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:04:18 crc kubenswrapper[4961]: I0120 11:04:18.537584 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:18 crc kubenswrapper[4961]: I0120 11:04:18.539618 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:18 crc kubenswrapper[4961]: I0120 11:04:18.539684 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:18 crc kubenswrapper[4961]: I0120 11:04:18.539705 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:19 crc kubenswrapper[4961]: I0120 11:04:19.481937 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 11:19:46.678383517 +0000 UTC Jan 20 11:04:19 crc kubenswrapper[4961]: I0120 11:04:19.885762 4961 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 20 11:04:19 crc kubenswrapper[4961]: I0120 11:04:19.885854 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 20 11:04:19 crc kubenswrapper[4961]: I0120 11:04:19.895152 4961 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 20 11:04:19 crc kubenswrapper[4961]: I0120 11:04:19.895249 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 
20 11:04:20 crc kubenswrapper[4961]: I0120 11:04:20.482959 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 09:38:51.634420389 +0000 UTC Jan 20 11:04:20 crc kubenswrapper[4961]: I0120 11:04:20.505854 4961 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 20 11:04:20 crc kubenswrapper[4961]: I0120 11:04:20.505929 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 20 11:04:21 crc kubenswrapper[4961]: I0120 11:04:21.281849 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 20 11:04:21 crc kubenswrapper[4961]: I0120 11:04:21.282178 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:21 crc kubenswrapper[4961]: I0120 11:04:21.283702 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:21 crc kubenswrapper[4961]: I0120 11:04:21.283774 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:21 crc kubenswrapper[4961]: I0120 11:04:21.283798 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:21 crc kubenswrapper[4961]: I0120 11:04:21.319856 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 20 11:04:21 crc kubenswrapper[4961]: I0120 11:04:21.483137 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 12:25:21.002788552 +0000 UTC Jan 20 11:04:21 crc kubenswrapper[4961]: I0120 11:04:21.631125 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:21 crc kubenswrapper[4961]: I0120 11:04:21.632608 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:21 crc kubenswrapper[4961]: I0120 11:04:21.632680 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:21 crc kubenswrapper[4961]: I0120 11:04:21.632698 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:21 crc kubenswrapper[4961]: I0120 11:04:21.650397 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 20 11:04:22 crc kubenswrapper[4961]: I0120 11:04:22.484242 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 17:44:54.125591133 +0000 UTC Jan 20 11:04:22 crc kubenswrapper[4961]: I0120 11:04:22.633750 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:22 crc 
kubenswrapper[4961]: I0120 11:04:22.634818 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:22 crc kubenswrapper[4961]: I0120 11:04:22.634866 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:22 crc kubenswrapper[4961]: I0120 11:04:22.634878 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:23 crc kubenswrapper[4961]: I0120 11:04:23.484936 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 15:39:51.248995407 +0000 UTC Jan 20 11:04:23 crc kubenswrapper[4961]: I0120 11:04:23.742402 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:23 crc kubenswrapper[4961]: I0120 11:04:23.742649 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:23 crc kubenswrapper[4961]: I0120 11:04:23.743727 4961 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 20 11:04:23 crc kubenswrapper[4961]: I0120 11:04:23.743919 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 20 11:04:23 crc kubenswrapper[4961]: I0120 11:04:23.744495 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:23 crc kubenswrapper[4961]: I0120 11:04:23.744558 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:23 crc kubenswrapper[4961]: I0120 11:04:23.744584 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:23 crc kubenswrapper[4961]: I0120 11:04:23.750191 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.485771 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 18:56:32.902557034 +0000 UTC Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.640174 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.640736 4961 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.640813 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.641748 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.641802 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.641825 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.811095 4961 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.811169 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 20 11:04:24 crc kubenswrapper[4961]: E0120 11:04:24.888372 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.891149 4961 trace.go:236] Trace[911120774]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Jan-2026 11:04:10.971) (total time: 13919ms): Jan 20 11:04:24 crc kubenswrapper[4961]: Trace[911120774]: ---"Objects listed" error: 13919ms (11:04:24.891) Jan 20 11:04:24 crc kubenswrapper[4961]: Trace[911120774]: [13.91994355s] [13.91994355s] END Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.891188 4961 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.891496 4961 trace.go:236] Trace[974482684]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Jan-2026 11:04:10.634) (total time: 14257ms): Jan 20 11:04:24 crc kubenswrapper[4961]: Trace[974482684]: ---"Objects listed" error: 14257ms (11:04:24.891) Jan 20 11:04:24 crc kubenswrapper[4961]: Trace[974482684]: [14.257081206s] [14.257081206s] END Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.891579 4961 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.893241 4961 trace.go:236] Trace[905584085]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Jan-2026 11:04:10.171) (total time: 14721ms): Jan 20 11:04:24 crc kubenswrapper[4961]: Trace[905584085]: ---"Objects listed" error: 14721ms (11:04:24.893) Jan 20 11:04:24 crc kubenswrapper[4961]: Trace[905584085]: [14.721335882s] [14.721335882s] END Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.893271 4961 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 20 11:04:24 crc 
kubenswrapper[4961]: I0120 11:04:24.893291 4961 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 20 11:04:24 crc kubenswrapper[4961]: E0120 11:04:24.893438 4961 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.894657 4961 trace.go:236] Trace[750416000]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Jan-2026 11:04:10.638) (total time: 14255ms): Jan 20 11:04:24 crc kubenswrapper[4961]: Trace[750416000]: ---"Objects listed" error: 14255ms (11:04:24.894) Jan 20 11:04:24 crc kubenswrapper[4961]: Trace[750416000]: [14.255623314s] [14.255623314s] END Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.894688 4961 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 20 11:04:24 crc kubenswrapper[4961]: I0120 11:04:24.904583 4961 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.257341 4961 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.257426 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.485994 4961 apiserver.go:52] "Watching apiserver" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.485933 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 20:09:11.78039535 +0000 UTC Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.490318 4961 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.490828 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.491377 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.491408 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.491539 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.491539 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 11:04:25 crc kubenswrapper[4961]: E0120 11:04:25.491674 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 11:04:25 crc kubenswrapper[4961]: E0120 11:04:25.491782 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.491837 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.491874 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:25 crc kubenswrapper[4961]: E0120 11:04:25.492047 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.493617 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.494037 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.495032 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.495148 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.495490 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.495576 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.495920 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.495091 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.496342 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.528963 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.555603 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.573373 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.576321 4961 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.596598 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.596881 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.596929 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.596967 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.596999 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597234 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597266 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597295 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " 
Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597332 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597367 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597399 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597447 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597479 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597510 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597510 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597538 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597570 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597604 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597636 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597665 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597695 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597725 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597757 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597788 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597819 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597850 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597880 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597911 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597943 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.597980 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598011 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598044 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598097 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598129 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598138 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: 
"kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598080 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598136 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598229 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598158 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598368 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598408 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598682 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598535 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598818 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598855 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598898 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598931 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598965 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.598996 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599024 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599078 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599104 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" 
(UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599128 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599127 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599158 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599187 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599211 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599236 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599263 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599286 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599257 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599308 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599342 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599427 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599473 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599506 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599537 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599566 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599585 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599591 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599614 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599668 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599694 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599720 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599712 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599746 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599747 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599767 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599785 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599807 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599832 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599854 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599873 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599893 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599917 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599940 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599964 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599989 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600011 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600029 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600051 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600092 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600115 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600135 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600155 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600179 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600201 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600222 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600245 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600273 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600316 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600347 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600377 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600404 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600428 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600452 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600477 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600499 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600524 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600545 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600569 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600592 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600616 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600636 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600658 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600683 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601599 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601627 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601646 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601672 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601696 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601719 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601748 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601776 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601802 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601832 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 11:04:25 crc 
kubenswrapper[4961]: I0120 11:04:25.601860 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601880 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601917 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601939 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601957 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601978 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601996 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602014 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602035 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602078 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 20 11:04:25 
crc kubenswrapper[4961]: I0120 11:04:25.602109 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602134 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602157 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602218 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602246 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602270 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602298 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602321 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602340 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602361 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: 
\"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602378 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602394 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602410 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602426 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602444 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602461 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602481 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602499 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602519 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602539 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602558 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602632 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602654 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602672 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602690 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602787 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602808 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602826 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602843 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602863 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602883 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602902 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602919 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602944 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602962 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602980 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602997 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603014 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603030 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603049 4961 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603081 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603101 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603121 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603138 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603158 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603176 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603198 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603217 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603236 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603257 4961 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603280 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603301 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603322 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603343 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603364 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603384 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603402 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603421 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603438 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 20 11:04:25 crc 
kubenswrapper[4961]: I0120 11:04:25.603456 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603475 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603493 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603514 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603532 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603550 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603566 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603585 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603605 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603624 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 11:04:25 crc 
kubenswrapper[4961]: I0120 11:04:25.603655 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603673 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603696 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603715 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603734 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603751 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603770 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603815 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603837 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603858 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603883 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603904 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603924 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603945 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603969 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603988 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604008 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604031 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604049 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604089 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604110 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604177 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604190 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604201 4961 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604211 4961 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604223 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604234 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604245 4961 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604255 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604267 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604277 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604287 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604298 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604308 4961 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604318 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604328 4961 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604338 4961 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.599799 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600178 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.605502 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600251 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600354 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600418 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600924 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.600935 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601010 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601139 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601274 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601446 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601486 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601796 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601898 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.601920 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602036 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602491 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602551 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603084 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603132 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.602874 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603234 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603245 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603683 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.603790 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604045 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604293 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604367 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604623 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.604637 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.605094 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.605230 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.605311 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.605448 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.605872 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.606010 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.606136 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.606288 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.606401 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.606647 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.606669 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.606788 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: E0120 11:04:25.606933 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:26.106902496 +0000 UTC m=+18.891402367 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.606986 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.607304 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.607334 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.607411 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.607471 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.608218 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.608516 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.608851 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.608957 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.609049 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.609086 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.609192 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.609365 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.609487 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.609744 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.609819 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.609888 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.609973 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.610037 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.610100 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.610175 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.610315 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.610586 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.610640 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.610596 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.610816 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.610929 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.610975 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.611495 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.611733 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.611948 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.612174 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.612242 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). 
InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.612373 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.612790 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.613015 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.613398 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.613569 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.613743 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.613754 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.614178 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.614303 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.615452 4961 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.615715 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 11:04:25 crc kubenswrapper[4961]: E0120 11:04:25.615897 4961 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 11:04:25 crc kubenswrapper[4961]: E0120 11:04:25.615998 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:26.115969241 +0000 UTC m=+18.900469312 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.616166 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.616384 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.616925 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.617229 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.617414 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.619697 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.620099 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.620215 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.620772 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.620883 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.620849 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.620912 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.618412 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.620927 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.621009 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.621031 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.621812 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.617770 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.622123 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.623266 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.623301 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.623376 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.623696 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.624009 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: E0120 11:04:25.624105 4961 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.624148 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.624183 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: E0120 11:04:25.624211 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:26.124178595 +0000 UTC m=+18.908678466 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.624455 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.624696 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.625089 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.625210 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.625304 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.625580 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.625737 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.625754 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.626480 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.627434 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.627628 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.628882 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.630447 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.631530 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.638335 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.638512 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.639389 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.639459 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.639488 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.639516 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.639607 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 11:04:25 crc kubenswrapper[4961]: E0120 11:04:25.640188 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 11:04:25 crc kubenswrapper[4961]: E0120 11:04:25.640209 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 11:04:25 crc kubenswrapper[4961]: E0120 11:04:25.640223 4961 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:25 crc kubenswrapper[4961]: E0120 11:04:25.640290 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-20 11:04:26.140273086 +0000 UTC m=+18.924772957 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.640454 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.640813 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: E0120 11:04:25.641249 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 11:04:25 crc kubenswrapper[4961]: E0120 11:04:25.641273 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 11:04:25 crc kubenswrapper[4961]: E0120 11:04:25.641289 4961 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:25 crc kubenswrapper[4961]: E0120 11:04:25.641373 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:26.141335411 +0000 UTC m=+18.925835502 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.647215 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.650363 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.651773 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.651870 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.651915 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.651973 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.652681 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.652943 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.653215 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.653506 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.654091 4961 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5ebc519f6d0ad45a8fb85392da0ecb11cbf59c38b9cb59933bc6cc18426f7b5f" exitCode=255 Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.654101 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5ebc519f6d0ad45a8fb85392da0ecb11cbf59c38b9cb59933bc6cc18426f7b5f"} Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.656695 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.659606 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.660384 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.659964 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.660295 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.660417 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.660511 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.660799 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.660979 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.662451 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.662546 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.662867 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.663389 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.663754 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.663757 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.663831 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.664138 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.664444 4961 scope.go:117] "RemoveContainer" containerID="5ebc519f6d0ad45a8fb85392da0ecb11cbf59c38b9cb59933bc6cc18426f7b5f" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.664988 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.665017 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.665221 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.665390 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.665497 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.665530 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.666021 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.665921 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.667300 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.669132 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.669719 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.670093 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.675640 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.675845 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.676640 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.676725 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.676930 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.677087 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.677365 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.677607 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.678173 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.678345 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.678582 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.679807 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.681164 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.681250 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.682309 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.693527 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.695242 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.706032 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.706704 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.706554 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.706807 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.706661 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.706999 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707019 4961 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707032 4961 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707047 4961 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707081 4961 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707060 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707128 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707143 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707156 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707169 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707183 4961 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707196 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707209 4961 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707222 4961 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707234 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707244 4961 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707257 4961 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707268 4961 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707281 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707293 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707306 4961 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707318 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707331 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707343 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707355 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707365 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707377 4961 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707390 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707402 4961 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707416 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707430 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707442 4961 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707454 4961 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707467 4961 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707478 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707489 4961 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707503 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707521 4961 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707533 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707544 4961 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707555 4961 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707565 4961 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707577 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707588 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707602 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707613 4961 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.707624 4961 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.708850 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709707 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709731 4961 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709743 4961 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709753 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709763 4961 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709774 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709784 4961 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709794 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709826 4961 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709837 4961 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709863 4961 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709873 4961 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709882 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" 
DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709892 4961 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709902 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709912 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709922 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709931 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709941 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709949 4961 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709958 4961 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709970 4961 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709980 4961 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.709993 4961 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710003 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710012 4961 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" 
Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710036 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710047 4961 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710057 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710082 4961 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710093 4961 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710104 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710113 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710123 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710154 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710166 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710178 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710189 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710198 4961 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710216 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710225 4961 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710234 4961 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710243 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710252 4961 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710261 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710272 4961 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710281 4961 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710291 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710300 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710309 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710319 4961 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710328 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710338 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710382 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710393 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710403 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710414 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710423 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710433 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710445 4961 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710455 4961 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710466 4961 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710477 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710488 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710497 4961 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710510 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710521 4961 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710531 4961 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710540 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710553 4961 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710564 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710574 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710584 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710594 4961 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710604 4961 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710613 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710622 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710632 4961 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710641 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710650 4961 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710659 4961 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710668 4961 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710677 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710685 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710696 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710705 4961 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710714 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710723 4961 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710733 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710742 4961 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710751 4961 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710760 4961 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710770 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710779 4961 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710789 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710799 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710808 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710817 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710826 4961 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710836 4961 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710845 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710855 4961 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710865 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710875 4961 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710887 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710896 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710906 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710915 4961 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710926 4961 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710939 4961 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710948 4961 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710960 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710971 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710980 4961 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.710991 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.711001 4961 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.711011 4961 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" 
(UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.711021 4961 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.711031 4961 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.711042 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.711053 4961 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.711076 4961 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.711087 4961 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.711097 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.711107 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.711117 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.711128 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.711139 4961 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.711149 4961 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.711158 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" 
(UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.713655 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.722101 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8d950b7-2b32-443f-b6ea-67115df80c62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5711126878eeb10633f1b78029436e832beb6b014e35e3ba021407ebed1422e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32cd6ed8ff62bd299b9410a413c0662071a02537a85fb2d15d1b1a8083abe404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",
\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482ece6a2d496362ab7bf45d1d70e97dfb70711b4094a08c3acf411e2bb2bd5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ebc519f6d0ad45a8fb85392da0ecb11cbf59c38b9cb59933bc6cc18426f7b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebc519f6d0ad45a8fb85392da0ecb11cbf59c38b9cb59933bc6cc18426f7b5f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0120 11:04:24.894810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0120 11:04:24.895017 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 11:04:24.896637 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2072879064/tls.crt::/tmp/serving-cert-2072879064/tls.key\\\\\\\"\\\\nI0120 11:04:25.306352 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 11:04:25.308594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 11:04:25.308615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 11:04:25.308635 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 11:04:25.308641 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 11:04:25.313839 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0120 11:04:25.313847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0120 11:04:25.313868 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 11:04:25.313875 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 11:04:25.313881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 11:04:25.313884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 11:04:25.313888 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 11:04:25.313891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0120 11:04:25.316449 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef3a93f93008fcfb5b46f10e01ea85bd3bdc5a99e3683498ac34f4fe3466e2d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T11:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T11:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T11:04:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.732975 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.743655 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.758226 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.806698 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.811896 4961 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.812002 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.814022 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 11:04:25 crc kubenswrapper[4961]: I0120 11:04:25.818833 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 11:04:25 crc kubenswrapper[4961]: W0120 11:04:25.835838 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-7661d8a2e557e7d079ac41529178a8b2566db41f1bfd1912ea275f799bc4be85 WatchSource:0}: Error finding container 7661d8a2e557e7d079ac41529178a8b2566db41f1bfd1912ea275f799bc4be85: Status 404 returned error can't find the container with id 7661d8a2e557e7d079ac41529178a8b2566db41f1bfd1912ea275f799bc4be85 Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.114384 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:26 crc kubenswrapper[4961]: E0120 11:04:26.114746 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:27.11472561 +0000 UTC m=+19.899225481 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.215719 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.215759 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.215778 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.215800 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:26 crc kubenswrapper[4961]: E0120 11:04:26.215923 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 11:04:26 crc kubenswrapper[4961]: E0120 11:04:26.215939 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 11:04:26 crc kubenswrapper[4961]: E0120 11:04:26.215950 4961 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:26 crc kubenswrapper[4961]: E0120 11:04:26.216004 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:27.215990608 +0000 UTC m=+20.000490479 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:26 crc kubenswrapper[4961]: E0120 11:04:26.216042 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 11:04:26 crc kubenswrapper[4961]: E0120 11:04:26.216102 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 11:04:26 crc kubenswrapper[4961]: E0120 11:04:26.216103 4961 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 11:04:26 crc kubenswrapper[4961]: E0120 11:04:26.216137 4961 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 11:04:26 crc kubenswrapper[4961]: E0120 11:04:26.216249 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:27.216207423 +0000 UTC m=+20.000707334 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 11:04:26 crc kubenswrapper[4961]: E0120 11:04:26.216119 4961 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:26 crc kubenswrapper[4961]: E0120 11:04:26.216305 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:27.216274905 +0000 UTC m=+20.000774936 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 11:04:26 crc kubenswrapper[4961]: E0120 11:04:26.216335 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-20 11:04:27.216319296 +0000 UTC m=+20.000819367 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.486547 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 23:37:57.657154982 +0000 UTC Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.657570 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5b8b9dcc3c8d9c041abf22f05d4f1f41f84d45a066f7db5d629743799b7ae157"} Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.657639 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6e11c25c774002b3d9a5d6a70a3b434c32da8380374ed94a40a66ca22252d973"} Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.659742 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"71c8dc5a85dd0abe37ae62a5014a1a92a8c259965c43a7e17d219802d44f2b89"} Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.659799 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f5400a95614c6e565c3f3418fa20d1555162d457614a89661413883100bb6160"} Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.659814 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"33fbeafc17e1407c587e2a86e6da014122febf715a11f2da378c91c83a6f97a8"} Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.662598 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.665250 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"69a5f11e3e92b2ac3b75c763acc64d422f2a5877882e99b4ed0c48b03f41868b"} Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.666208 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.667684 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7661d8a2e557e7d079ac41529178a8b2566db41f1bfd1912ea275f799bc4be85"} Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 
11:04:26.679876 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:26Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.698842 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:26Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.714090 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:26Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.731291 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8d950b7-2b32-443f-b6ea-67115df80c62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5711126878eeb10633f1b78029436e832beb6b014e35e3ba021407ebed1422e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32cd6ed8ff62bd299b9410a413c0662071a02537a85fb2d15d1b1a8083abe404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482ece6a2d496362ab7bf45d1d70e97dfb70711b4094a08c3acf411e2bb2bd5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ebc519f6d0ad45a8fb85392da0ecb11cbf59c38b9cb59933bc6cc18426f7b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebc519f6d0ad45a8fb85392da0ecb11cbf59c38b9cb59933bc6cc18426f7b5f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20
T11:04:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0120 11:04:24.894810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0120 11:04:24.895017 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 11:04:24.896637 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2072879064/tls.crt::/tmp/serving-cert-2072879064/tls.key\\\\\\\"\\\\nI0120 11:04:25.306352 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 11:04:25.308594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 11:04:25.308615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 11:04:25.308635 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 11:04:25.308641 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 11:04:25.313839 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0120 11:04:25.313847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0120 11:04:25.313868 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 11:04:25.313875 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 11:04:25.313881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 11:04:25.313884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 11:04:25.313888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 11:04:25.313891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0120 11:04:25.316449 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef3a93f93008fcfb5b46f10e01ea85bd3bdc5a99e3683498ac34f4fe3466e2d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T11:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T11:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T11:04:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:26Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.749503 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:26Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.770146 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b8b9dcc3c8d9c041abf22f05d4f1f41f84d45a066f7db5d629743799b7ae157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:26Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.787317 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:26Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.806545 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:26Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.823554 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c8dc5a85dd0abe37ae62a5014a1a92a8c259965c43a7e17d219802d44f2b89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5400a95614c6e565c3f3418fa20d1555162d457614a89661413883100bb6160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:26Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.842175 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:26Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.857226 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8d950b7-2b32-443f-b6ea-67115df80c62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5711126878eeb10633f1b78029436e832beb6b014e35e3ba021407ebed1422e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32cd6ed8ff62bd299b9410a413c0662071a02537a85fb2d15d1b1a8083abe404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482ece6a2d496362ab7bf45d1d70e97dfb70711b4094a08c3acf411e2bb2bd5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a5f11e3e92b2ac3b75c763acc64d422f2a5877882e99b4ed0c48b03f41868b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebc519f6d0ad45a8fb85392da0ecb11cbf59c38b9cb59933bc6cc18426f7b5f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0120 11:04:24.894810 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0120 11:04:24.895017 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 11:04:24.896637 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2072879064/tls.crt::/tmp/serving-cert-2072879064/tls.key\\\\\\\"\\\\nI0120 11:04:25.306352 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 11:04:25.308594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 11:04:25.308615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 11:04:25.308635 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 11:04:25.308641 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 11:04:25.313839 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0120 11:04:25.313847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0120 11:04:25.313868 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 11:04:25.313875 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 11:04:25.313881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 11:04:25.313884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 11:04:25.313888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 11:04:25.313891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0120 11:04:25.316449 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T11:04:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef3a93f93008fcfb5b46f10e01ea85bd3bdc5a99e3683498ac34f4fe3466e2d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T11:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T11:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T11:04:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:26Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.871983 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:26Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.885636 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b8b9dcc3c8d9c041abf22f05d4f1f41f84d45a066f7db5d629743799b7ae157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:26Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:26 crc kubenswrapper[4961]: I0120 11:04:26.900947 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:26Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.123723 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:27 crc kubenswrapper[4961]: E0120 11:04:27.123849 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:29.123824933 +0000 UTC m=+21.908324804 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.225100 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.225308 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.225331 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.225487 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:27 crc kubenswrapper[4961]: E0120 11:04:27.225266 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 11:04:27 crc kubenswrapper[4961]: E0120 11:04:27.225534 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 11:04:27 crc kubenswrapper[4961]: E0120 11:04:27.225549 4961 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:27 crc kubenswrapper[4961]: E0120 11:04:27.225608 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:29.225590472 +0000 UTC m=+22.010090343 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:27 crc kubenswrapper[4961]: E0120 11:04:27.225403 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 11:04:27 crc kubenswrapper[4961]: E0120 11:04:27.225641 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 11:04:27 crc kubenswrapper[4961]: E0120 11:04:27.225650 4961 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:27 crc kubenswrapper[4961]: E0120 11:04:27.225676 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:29.225667564 +0000 UTC m=+22.010167435 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:27 crc kubenswrapper[4961]: E0120 11:04:27.225681 4961 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 11:04:27 crc kubenswrapper[4961]: E0120 11:04:27.225723 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:29.225712865 +0000 UTC m=+22.010212736 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 11:04:27 crc kubenswrapper[4961]: E0120 11:04:27.225461 4961 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 11:04:27 crc kubenswrapper[4961]: E0120 11:04:27.225766 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-20 11:04:29.225759146 +0000 UTC m=+22.010259017 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.487133 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 22:37:19.16880891 +0000 UTC Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.538770 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:27 crc kubenswrapper[4961]: E0120 11:04:27.538936 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.539561 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:27 crc kubenswrapper[4961]: E0120 11:04:27.539668 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.539846 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:27 crc kubenswrapper[4961]: E0120 11:04:27.539943 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.543895 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.544808 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.546993 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.548719 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.550135 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.551159 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.552365 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.553612 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.554843 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.555824 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:27Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.555901 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.558178 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.560868 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.561734 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.562338 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.563048 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.563602 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.565436 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" 
path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.565832 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.566474 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.567132 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.567712 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.568457 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.568916 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.569662 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.570173 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.570799 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.571479 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.572011 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.572771 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.573369 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.573960 4961 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" 
path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.574712 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.577131 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.578048 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.579090 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.581147 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.582399 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.583341 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.584527 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.585782 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.590415 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.591687 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.593708 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.595096 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.595990 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.597418 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.599013 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.600668 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.601345 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.602527 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.603056 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:27Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.603429 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.604119 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.605452 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.606077 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.623762 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c8dc5a85dd0abe37ae62a5014a1a92a8c259965c43a7e17d219802d44f2b89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5400a95614c6e565c3f3418fa20d1555162d457614a89661413883100bb6160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:27Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.644497 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b8b9dcc3c8d9c041abf22f05d4f1f41f84d45a066f7db5d629743799b7ae157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:27Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.657088 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:27Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.670207 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8d950b7-2b32-443f-b6ea-67115df80c62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5711126878eeb10633f1b78029436e832beb6b014e35e3ba021407ebed1422e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32cd6ed8ff62bd299b9410a413c0662071a02537a85fb2d15d1b1a8083abe404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482ece6a2d496362ab7bf45d1d70e97dfb70711b4094a08c3acf411e2bb2bd5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69a5f11e3e92b2ac3b75c763acc64d422f2a5877882e99b4ed0c48b03f41868b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebc519f6d0ad45a8fb85392da0ecb11cbf59c38b9cb59933bc6cc18426f7b5f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"le observer\\\\nW0120 11:04:24.894810 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0120 11:04:24.895017 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 11:04:24.896637 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2072879064/tls.crt::/tmp/serving-cert-2072879064/tls.key\\\\\\\"\\\\nI0120 11:04:25.306352 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 11:04:25.308594 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 11:04:25.308615 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 11:04:25.308635 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 11:04:25.308641 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 11:04:25.313839 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0120 11:04:25.313847 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0120 11:04:25.313868 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 11:04:25.313875 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 11:04:25.313881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 11:04:25.313884 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 11:04:25.313888 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 11:04:25.313891 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0120 11:04:25.316449 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T11:04:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef3a93f93008fcfb5b46f10e01ea85bd3bdc5a99e3683498ac34f4fe3466e2d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T11:04:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T11:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T11:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T11:04:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:27Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:27 crc kubenswrapper[4961]: I0120 11:04:27.682738 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T11:04:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T11:04:27Z is after 2025-08-24T17:21:41Z" Jan 20 11:04:28 crc kubenswrapper[4961]: I0120 11:04:28.094053 4961 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 11:04:28 crc kubenswrapper[4961]: I0120 11:04:28.102628 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:28 crc kubenswrapper[4961]: I0120 11:04:28.102695 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:28 crc kubenswrapper[4961]: I0120 11:04:28.102778 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:28 crc kubenswrapper[4961]: I0120 11:04:28.102881 4961 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 11:04:28 crc kubenswrapper[4961]: I0120 11:04:28.111713 4961 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 20 11:04:28 crc kubenswrapper[4961]: I0120 11:04:28.111979 4961 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 20 11:04:28 crc kubenswrapper[4961]: I0120 11:04:28.113264 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 11:04:28 crc kubenswrapper[4961]: I0120 11:04:28.113297 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 11:04:28 crc kubenswrapper[4961]: I0120 11:04:28.113309 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 11:04:28 crc kubenswrapper[4961]: I0120 11:04:28.113325 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 11:04:28 crc kubenswrapper[4961]: I0120 11:04:28.113336 4961 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T11:04:28Z","lastTransitionTime":"2026-01-20T11:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 11:04:28 crc kubenswrapper[4961]: I0120 11:04:28.487857 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 15:47:47.410480516 +0000 UTC Jan 20 11:04:28 crc kubenswrapper[4961]: I0120 11:04:28.488003 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 20 11:04:28 crc kubenswrapper[4961]: I0120 11:04:28.500610 4961 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 20 11:04:29 crc kubenswrapper[4961]: I0120 11:04:29.145458 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:29 crc kubenswrapper[4961]: E0120 11:04:29.145670 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:33.145640875 +0000 UTC m=+25.930140776 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:29 crc kubenswrapper[4961]: I0120 11:04:29.246501 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:29 crc kubenswrapper[4961]: I0120 11:04:29.246606 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:29 crc kubenswrapper[4961]: I0120 11:04:29.246677 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:29 crc 
kubenswrapper[4961]: I0120 11:04:29.246757 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:29 crc kubenswrapper[4961]: E0120 11:04:29.246796 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 11:04:29 crc kubenswrapper[4961]: E0120 11:04:29.246841 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 11:04:29 crc kubenswrapper[4961]: E0120 11:04:29.246862 4961 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:29 crc kubenswrapper[4961]: E0120 11:04:29.246955 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:33.246925234 +0000 UTC m=+26.031425135 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:29 crc kubenswrapper[4961]: E0120 11:04:29.246974 4961 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 11:04:29 crc kubenswrapper[4961]: E0120 11:04:29.247149 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:33.247118118 +0000 UTC m=+26.031617989 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 11:04:29 crc kubenswrapper[4961]: E0120 11:04:29.247226 4961 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 11:04:29 crc kubenswrapper[4961]: E0120 11:04:29.247221 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 11:04:29 crc kubenswrapper[4961]: E0120 11:04:29.247270 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 11:04:29 crc kubenswrapper[4961]: E0120 11:04:29.247301 4961 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:29 crc kubenswrapper[4961]: E0120 11:04:29.247274 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:33.247265412 +0000 UTC m=+26.031765283 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 11:04:29 crc kubenswrapper[4961]: E0120 11:04:29.247530 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:33.247446796 +0000 UTC m=+26.031946687 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:29 crc kubenswrapper[4961]: I0120 11:04:29.538937 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:29 crc kubenswrapper[4961]: I0120 11:04:29.538987 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:29 crc kubenswrapper[4961]: I0120 11:04:29.539002 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:29 crc kubenswrapper[4961]: E0120 11:04:29.539128 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 11:04:29 crc kubenswrapper[4961]: E0120 11:04:29.539233 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 11:04:29 crc kubenswrapper[4961]: E0120 11:04:29.539424 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 11:04:30 crc kubenswrapper[4961]: I0120 11:04:30.678896 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"267616fe058f178e34296f08935a5ae074a4d8209dd96cec6455ddf7be73aa3a"} Jan 20 11:04:30 crc kubenswrapper[4961]: I0120 11:04:30.751218 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=5.751194881 podStartE2EDuration="5.751194881s" podCreationTimestamp="2026-01-20 11:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:30.7372277 +0000 UTC m=+23.521727571" watchObservedRunningTime="2026-01-20 11:04:30.751194881 +0000 UTC m=+23.535694752" Jan 20 11:04:30 crc kubenswrapper[4961]: I0120 11:04:30.928034 4961 csr.go:261] certificate signing request csr-h4sdh is approved, waiting to be issued Jan 20 11:04:30 crc kubenswrapper[4961]: I0120 11:04:30.934127 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-gtl8q"] Jan 20 11:04:30 crc kubenswrapper[4961]: I0120 11:04:30.934536 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gtl8q" Jan 20 11:04:30 crc kubenswrapper[4961]: I0120 11:04:30.935438 4961 csr.go:257] certificate signing request csr-h4sdh is issued Jan 20 11:04:30 crc kubenswrapper[4961]: I0120 11:04:30.935833 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-2rldv"] Jan 20 11:04:30 crc kubenswrapper[4961]: I0120 11:04:30.936236 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2rldv" Jan 20 11:04:30 crc kubenswrapper[4961]: I0120 11:04:30.937188 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 20 11:04:30 crc kubenswrapper[4961]: I0120 11:04:30.937419 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 20 11:04:30 crc kubenswrapper[4961]: I0120 11:04:30.937960 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 20 11:04:30 crc kubenswrapper[4961]: I0120 11:04:30.938301 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 20 11:04:30 crc kubenswrapper[4961]: I0120 11:04:30.939053 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 20 11:04:30 crc kubenswrapper[4961]: I0120 11:04:30.939196 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 20 11:04:30 crc kubenswrapper[4961]: I0120 11:04:30.939713 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.038745 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-ws595"] Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.039223 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.042267 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.042326 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.042275 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.042401 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.045897 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.066311 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c0d1a495-120d-4998-a1c4-f2d5080cd724-hosts-file\") pod \"node-resolver-2rldv\" (UID: \"c0d1a495-120d-4998-a1c4-f2d5080cd724\") " pod="openshift-dns/node-resolver-2rldv" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.066376 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c39ab8c-896f-4793-82ab-b055918eb009-serviceca\") pod \"node-ca-gtl8q\" (UID: \"1c39ab8c-896f-4793-82ab-b055918eb009\") " pod="openshift-image-registry/node-ca-gtl8q" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.066424 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2blv\" (UniqueName: 
\"kubernetes.io/projected/1c39ab8c-896f-4793-82ab-b055918eb009-kube-api-access-x2blv\") pod \"node-ca-gtl8q\" (UID: \"1c39ab8c-896f-4793-82ab-b055918eb009\") " pod="openshift-image-registry/node-ca-gtl8q" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.066472 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c39ab8c-896f-4793-82ab-b055918eb009-host\") pod \"node-ca-gtl8q\" (UID: \"1c39ab8c-896f-4793-82ab-b055918eb009\") " pod="openshift-image-registry/node-ca-gtl8q" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.066516 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phfwl\" (UniqueName: \"kubernetes.io/projected/c0d1a495-120d-4998-a1c4-f2d5080cd724-kube-api-access-phfwl\") pod \"node-resolver-2rldv\" (UID: \"c0d1a495-120d-4998-a1c4-f2d5080cd724\") " pod="openshift-dns/node-resolver-2rldv" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.167815 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phfwl\" (UniqueName: \"kubernetes.io/projected/c0d1a495-120d-4998-a1c4-f2d5080cd724-kube-api-access-phfwl\") pod \"node-resolver-2rldv\" (UID: \"c0d1a495-120d-4998-a1c4-f2d5080cd724\") " pod="openshift-dns/node-resolver-2rldv" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.167875 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-os-release\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.167893 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-var-lib-cni-bin\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.167939 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-etc-kubernetes\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.167965 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c0d1a495-120d-4998-a1c4-f2d5080cd724-hosts-file\") pod \"node-resolver-2rldv\" (UID: \"c0d1a495-120d-4998-a1c4-f2d5080cd724\") " pod="openshift-dns/node-resolver-2rldv" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.167989 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-multus-cni-dir\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.168003 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-run-netns\") 
pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.168019 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-run-multus-certs\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.168035 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs2hc\" (UniqueName: \"kubernetes.io/projected/f11b0f95-0c1e-4985-8743-fe9dceef8734-kube-api-access-gs2hc\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.168049 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f11b0f95-0c1e-4985-8743-fe9dceef8734-multus-daemon-config\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.168083 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c39ab8c-896f-4793-82ab-b055918eb009-serviceca\") pod \"node-ca-gtl8q\" (UID: \"1c39ab8c-896f-4793-82ab-b055918eb009\") " pod="openshift-image-registry/node-ca-gtl8q" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.168098 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-cnibin\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.168111 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-multus-socket-dir-parent\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.168126 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-run-k8s-cni-cncf-io\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.168147 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-system-cni-dir\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.168162 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-multus-conf-dir\") pod \"multus-ws595\" (UID: 
\"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.168180 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2blv\" (UniqueName: \"kubernetes.io/projected/1c39ab8c-896f-4793-82ab-b055918eb009-kube-api-access-x2blv\") pod \"node-ca-gtl8q\" (UID: \"1c39ab8c-896f-4793-82ab-b055918eb009\") " pod="openshift-image-registry/node-ca-gtl8q" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.168197 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-var-lib-cni-multus\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.168220 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c39ab8c-896f-4793-82ab-b055918eb009-host\") pod \"node-ca-gtl8q\" (UID: \"1c39ab8c-896f-4793-82ab-b055918eb009\") " pod="openshift-image-registry/node-ca-gtl8q" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.168236 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-var-lib-kubelet\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.168250 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-hostroot\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.168264 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f11b0f95-0c1e-4985-8743-fe9dceef8734-cni-binary-copy\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.168675 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c0d1a495-120d-4998-a1c4-f2d5080cd724-hosts-file\") pod \"node-resolver-2rldv\" (UID: \"c0d1a495-120d-4998-a1c4-f2d5080cd724\") " pod="openshift-dns/node-resolver-2rldv" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.169593 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c39ab8c-896f-4793-82ab-b055918eb009-serviceca\") pod \"node-ca-gtl8q\" (UID: \"1c39ab8c-896f-4793-82ab-b055918eb009\") " pod="openshift-image-registry/node-ca-gtl8q" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.169817 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c39ab8c-896f-4793-82ab-b055918eb009-host\") pod \"node-ca-gtl8q\" (UID: \"1c39ab8c-896f-4793-82ab-b055918eb009\") " pod="openshift-image-registry/node-ca-gtl8q" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.179266 4961 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-daemon-48nk4"] Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.179689 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.182516 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.183154 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.183802 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.187685 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-crx8m"] Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.188398 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.192098 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.192212 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.193135 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.194448 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.201502 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phfwl\" (UniqueName: \"kubernetes.io/projected/c0d1a495-120d-4998-a1c4-f2d5080cd724-kube-api-access-phfwl\") pod \"node-resolver-2rldv\" (UID: \"c0d1a495-120d-4998-a1c4-f2d5080cd724\") " pod="openshift-dns/node-resolver-2rldv" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.205305 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2blv\" (UniqueName: \"kubernetes.io/projected/1c39ab8c-896f-4793-82ab-b055918eb009-kube-api-access-x2blv\") pod \"node-ca-gtl8q\" (UID: \"1c39ab8c-896f-4793-82ab-b055918eb009\") " pod="openshift-image-registry/node-ca-gtl8q" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.251224 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gtl8q" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.257446 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2rldv" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269106 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f11b0f95-0c1e-4985-8743-fe9dceef8734-cni-binary-copy\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269168 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-os-release\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269190 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-var-lib-cni-bin\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269210 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-etc-kubernetes\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269247 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8a5754ab-8fe3-41b8-b760-b3d154e89ba8-rootfs\") pod \"machine-config-daemon-48nk4\" (UID: \"8a5754ab-8fe3-41b8-b760-b3d154e89ba8\") " pod="openshift-machine-config-operator/machine-config-daemon-48nk4" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269266 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a5754ab-8fe3-41b8-b760-b3d154e89ba8-proxy-tls\") pod \"machine-config-daemon-48nk4\" (UID: \"8a5754ab-8fe3-41b8-b760-b3d154e89ba8\") " pod="openshift-machine-config-operator/machine-config-daemon-48nk4" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269285 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8x97\" (UniqueName: \"kubernetes.io/projected/9bb5f712-81d8-4313-943a-3acd4dfaf25c-kube-api-access-s8x97\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269323 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs2hc\" (UniqueName: \"kubernetes.io/projected/f11b0f95-0c1e-4985-8743-fe9dceef8734-kube-api-access-gs2hc\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269341 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-multus-socket-dir-parent\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " 
pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269358 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f11b0f95-0c1e-4985-8743-fe9dceef8734-multus-daemon-config\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269399 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-system-cni-dir\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269421 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9bb5f712-81d8-4313-943a-3acd4dfaf25c-system-cni-dir\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269444 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-var-lib-cni-multus\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269481 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-multus-conf-dir\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269501 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9bb5f712-81d8-4313-943a-3acd4dfaf25c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269519 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-var-lib-kubelet\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269550 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-hostroot\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269568 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxmw2\" (UniqueName: \"kubernetes.io/projected/8a5754ab-8fe3-41b8-b760-b3d154e89ba8-kube-api-access-jxmw2\") pod \"machine-config-daemon-48nk4\" (UID: \"8a5754ab-8fe3-41b8-b760-b3d154e89ba8\") " 
pod="openshift-machine-config-operator/machine-config-daemon-48nk4" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269596 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9bb5f712-81d8-4313-943a-3acd4dfaf25c-os-release\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269639 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-multus-cni-dir\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269659 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-run-netns\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269679 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-run-multus-certs\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269714 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9bb5f712-81d8-4313-943a-3acd4dfaf25c-cni-binary-copy\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269729 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f11b0f95-0c1e-4985-8743-fe9dceef8734-cni-binary-copy\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269735 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-cnibin\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269754 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-run-k8s-cni-cncf-io\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269785 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-var-lib-cni-multus\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269790 
4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8a5754ab-8fe3-41b8-b760-b3d154e89ba8-mcd-auth-proxy-config\") pod \"machine-config-daemon-48nk4\" (UID: \"8a5754ab-8fe3-41b8-b760-b3d154e89ba8\") " pod="openshift-machine-config-operator/machine-config-daemon-48nk4" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269840 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-os-release\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269851 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9bb5f712-81d8-4313-943a-3acd4dfaf25c-cnibin\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269866 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-var-lib-cni-bin\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269871 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9bb5f712-81d8-4313-943a-3acd4dfaf25c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269890 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-etc-kubernetes\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.269937 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-multus-conf-dir\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.270014 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-var-lib-kubelet\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.270042 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-hostroot\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.270150 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-multus-cni-dir\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.270209 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-run-netns\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.270227 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-multus-socket-dir-parent\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.270236 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-run-multus-certs\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.270280 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-host-run-k8s-cni-cncf-io\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.270480 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-system-cni-dir\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.270515 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f11b0f95-0c1e-4985-8743-fe9dceef8734-cnibin\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.270665 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f11b0f95-0c1e-4985-8743-fe9dceef8734-multus-daemon-config\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.286816 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v"] Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.287249 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.289436 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.289666 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.290053 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.291609 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.291879 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs2hc\" (UniqueName: \"kubernetes.io/projected/f11b0f95-0c1e-4985-8743-fe9dceef8734-kube-api-access-gs2hc\") pod \"multus-ws595\" (UID: \"f11b0f95-0c1e-4985-8743-fe9dceef8734\") " pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.299768 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-cpvtl"] Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.300272 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:31 crc kubenswrapper[4961]: E0120 11:04:31.300343 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cpvtl" podUID="f4c616c0-5852-4a0c-98e7-7d6af398ed2e" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.333239 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vzlgg"] Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.334380 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.337536 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.337557 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.339146 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.339292 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.339420 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.339713 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.346807 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.352457 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ws595" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370514 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d428a8c3-729d-4c3a-b1f6-253dbec8c153-service-ca\") pod \"cluster-version-operator-5c965bbfc6-65x6v\" (UID: \"d428a8c3-729d-4c3a-b1f6-253dbec8c153\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370556 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9bb5f712-81d8-4313-943a-3acd4dfaf25c-os-release\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370577 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9bb5f712-81d8-4313-943a-3acd4dfaf25c-cni-binary-copy\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370603 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8a5754ab-8fe3-41b8-b760-b3d154e89ba8-mcd-auth-proxy-config\") pod \"machine-config-daemon-48nk4\" (UID: \"8a5754ab-8fe3-41b8-b760-b3d154e89ba8\") " pod="openshift-machine-config-operator/machine-config-daemon-48nk4" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370621 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9bb5f712-81d8-4313-943a-3acd4dfaf25c-cnibin\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " 
pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370638 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9bb5f712-81d8-4313-943a-3acd4dfaf25c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370663 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs\") pod \"network-metrics-daemon-cpvtl\" (UID: \"f4c616c0-5852-4a0c-98e7-7d6af398ed2e\") " pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370680 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8a5754ab-8fe3-41b8-b760-b3d154e89ba8-rootfs\") pod \"machine-config-daemon-48nk4\" (UID: \"8a5754ab-8fe3-41b8-b760-b3d154e89ba8\") " pod="openshift-machine-config-operator/machine-config-daemon-48nk4" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370695 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a5754ab-8fe3-41b8-b760-b3d154e89ba8-proxy-tls\") pod \"machine-config-daemon-48nk4\" (UID: \"8a5754ab-8fe3-41b8-b760-b3d154e89ba8\") " pod="openshift-machine-config-operator/machine-config-daemon-48nk4" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370712 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8x97\" (UniqueName: \"kubernetes.io/projected/9bb5f712-81d8-4313-943a-3acd4dfaf25c-kube-api-access-s8x97\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370727 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zvmz\" (UniqueName: \"kubernetes.io/projected/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-kube-api-access-9zvmz\") pod \"network-metrics-daemon-cpvtl\" (UID: \"f4c616c0-5852-4a0c-98e7-7d6af398ed2e\") " pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370746 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d428a8c3-729d-4c3a-b1f6-253dbec8c153-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-65x6v\" (UID: \"d428a8c3-729d-4c3a-b1f6-253dbec8c153\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370765 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9bb5f712-81d8-4313-943a-3acd4dfaf25c-system-cni-dir\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370791 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d428a8c3-729d-4c3a-b1f6-253dbec8c153-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-65x6v\" (UID: \"d428a8c3-729d-4c3a-b1f6-253dbec8c153\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370809 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d428a8c3-729d-4c3a-b1f6-253dbec8c153-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-65x6v\" (UID: \"d428a8c3-729d-4c3a-b1f6-253dbec8c153\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370828 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9bb5f712-81d8-4313-943a-3acd4dfaf25c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370845 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d428a8c3-729d-4c3a-b1f6-253dbec8c153-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-65x6v\" (UID: \"d428a8c3-729d-4c3a-b1f6-253dbec8c153\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.370861 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxmw2\" (UniqueName: \"kubernetes.io/projected/8a5754ab-8fe3-41b8-b760-b3d154e89ba8-kube-api-access-jxmw2\") pod \"machine-config-daemon-48nk4\" (UID: \"8a5754ab-8fe3-41b8-b760-b3d154e89ba8\") " pod="openshift-machine-config-operator/machine-config-daemon-48nk4" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.371144 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9bb5f712-81d8-4313-943a-3acd4dfaf25c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.371180 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8a5754ab-8fe3-41b8-b760-b3d154e89ba8-rootfs\") pod \"machine-config-daemon-48nk4\" (UID: \"8a5754ab-8fe3-41b8-b760-b3d154e89ba8\") " pod="openshift-machine-config-operator/machine-config-daemon-48nk4" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.371227 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9bb5f712-81d8-4313-943a-3acd4dfaf25c-os-release\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.371403 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9bb5f712-81d8-4313-943a-3acd4dfaf25c-system-cni-dir\") pod \"multus-additional-cni-plugins-crx8m\" (UID: 
\"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.371861 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9bb5f712-81d8-4313-943a-3acd4dfaf25c-cnibin\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.371877 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9bb5f712-81d8-4313-943a-3acd4dfaf25c-cni-binary-copy\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.371918 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9bb5f712-81d8-4313-943a-3acd4dfaf25c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.372358 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8a5754ab-8fe3-41b8-b760-b3d154e89ba8-mcd-auth-proxy-config\") pod \"machine-config-daemon-48nk4\" (UID: \"8a5754ab-8fe3-41b8-b760-b3d154e89ba8\") " pod="openshift-machine-config-operator/machine-config-daemon-48nk4" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.388898 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a5754ab-8fe3-41b8-b760-b3d154e89ba8-proxy-tls\") pod \"machine-config-daemon-48nk4\" (UID: \"8a5754ab-8fe3-41b8-b760-b3d154e89ba8\") " pod="openshift-machine-config-operator/machine-config-daemon-48nk4" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.389806 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8x97\" (UniqueName: \"kubernetes.io/projected/9bb5f712-81d8-4313-943a-3acd4dfaf25c-kube-api-access-s8x97\") pod \"multus-additional-cni-plugins-crx8m\" (UID: \"9bb5f712-81d8-4313-943a-3acd4dfaf25c\") " pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.391981 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxmw2\" (UniqueName: \"kubernetes.io/projected/8a5754ab-8fe3-41b8-b760-b3d154e89ba8-kube-api-access-jxmw2\") pod \"machine-config-daemon-48nk4\" (UID: \"8a5754ab-8fe3-41b8-b760-b3d154e89ba8\") " pod="openshift-machine-config-operator/machine-config-daemon-48nk4" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.471860 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d428a8c3-729d-4c3a-b1f6-253dbec8c153-service-ca\") pod \"cluster-version-operator-5c965bbfc6-65x6v\" (UID: \"d428a8c3-729d-4c3a-b1f6-253dbec8c153\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.471915 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-slash\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.471952 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-kubelet\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.471976 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-run-ovn-kubernetes\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.472001 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-run-openvswitch\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.472020 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkn9d\" (UniqueName: \"kubernetes.io/projected/945c6471-74be-43a0-b1cf-7d084e5fa394-kube-api-access-xkn9d\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.472044 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/945c6471-74be-43a0-b1cf-7d084e5fa394-ovn-node-metrics-cert\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.472097 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-systemd-units\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.472118 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/945c6471-74be-43a0-b1cf-7d084e5fa394-ovnkube-script-lib\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.472143 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs\") pod \"network-metrics-daemon-cpvtl\" (UID: \"f4c616c0-5852-4a0c-98e7-7d6af398ed2e\") " pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.472262 4961 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-run-systemd\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: E0120 11:04:31.472276 4961 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.472316 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zvmz\" (UniqueName: \"kubernetes.io/projected/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-kube-api-access-9zvmz\") pod \"network-metrics-daemon-cpvtl\" (UID: \"f4c616c0-5852-4a0c-98e7-7d6af398ed2e\") " pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:31 crc kubenswrapper[4961]: E0120 11:04:31.472429 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs podName:f4c616c0-5852-4a0c-98e7-7d6af398ed2e nodeName:}" failed. No retries permitted until 2026-01-20 11:04:31.972393368 +0000 UTC m=+24.756893299 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs") pod "network-metrics-daemon-cpvtl" (UID: "f4c616c0-5852-4a0c-98e7-7d6af398ed2e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.472680 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-var-lib-openvswitch\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.472719 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-run-ovn\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.472741 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d428a8c3-729d-4c3a-b1f6-253dbec8c153-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-65x6v\" (UID: \"d428a8c3-729d-4c3a-b1f6-253dbec8c153\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.472808 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d428a8c3-729d-4c3a-b1f6-253dbec8c153-service-ca\") pod \"cluster-version-operator-5c965bbfc6-65x6v\" (UID: \"d428a8c3-729d-4c3a-b1f6-253dbec8c153\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.472814 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d428a8c3-729d-4c3a-b1f6-253dbec8c153-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-65x6v\" (UID: \"d428a8c3-729d-4c3a-b1f6-253dbec8c153\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.472892 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-cni-netd\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.472947 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-node-log\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.472979 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-cni-bin\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.473024 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d428a8c3-729d-4c3a-b1f6-253dbec8c153-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-65x6v\" (UID: \"d428a8c3-729d-4c3a-b1f6-253dbec8c153\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.473057 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d428a8c3-729d-4c3a-b1f6-253dbec8c153-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-65x6v\" (UID: \"d428a8c3-729d-4c3a-b1f6-253dbec8c153\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.473117 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d428a8c3-729d-4c3a-b1f6-253dbec8c153-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-65x6v\" (UID: \"d428a8c3-729d-4c3a-b1f6-253dbec8c153\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.473144 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-etc-openvswitch\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.473163 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-log-socket\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.473187 4961 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-run-netns\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.473205 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/945c6471-74be-43a0-b1cf-7d084e5fa394-ovnkube-config\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.473243 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.473263 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/945c6471-74be-43a0-b1cf-7d084e5fa394-env-overrides\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.473343 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d428a8c3-729d-4c3a-b1f6-253dbec8c153-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-65x6v\" (UID: \"d428a8c3-729d-4c3a-b1f6-253dbec8c153\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.477397 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d428a8c3-729d-4c3a-b1f6-253dbec8c153-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-65x6v\" (UID: \"d428a8c3-729d-4c3a-b1f6-253dbec8c153\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.488475 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.488926 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d428a8c3-729d-4c3a-b1f6-253dbec8c153-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-65x6v\" (UID: \"d428a8c3-729d-4c3a-b1f6-253dbec8c153\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.489639 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zvmz\" (UniqueName: \"kubernetes.io/projected/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-kube-api-access-9zvmz\") pod \"network-metrics-daemon-cpvtl\" (UID: \"f4c616c0-5852-4a0c-98e7-7d6af398ed2e\") " pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.498136 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-crx8m" Jan 20 11:04:31 crc kubenswrapper[4961]: W0120 11:04:31.506391 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a5754ab_8fe3_41b8_b760_b3d154e89ba8.slice/crio-d228363ee7813f6817a3ea54e9502a9b7a05a20d617b36bfc426848d83d5a2e4 WatchSource:0}: Error finding container d228363ee7813f6817a3ea54e9502a9b7a05a20d617b36bfc426848d83d5a2e4: Status 404 returned error can't find the container with id d228363ee7813f6817a3ea54e9502a9b7a05a20d617b36bfc426848d83d5a2e4 Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.539842 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:31 crc kubenswrapper[4961]: E0120 11:04:31.539977 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.540250 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:31 crc kubenswrapper[4961]: E0120 11:04:31.540405 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.540701 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:31 crc kubenswrapper[4961]: E0120 11:04:31.540762 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.576369 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-kubelet\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.576405 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-slash\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.576429 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-run-ovn-kubernetes\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.576450 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkn9d\" (UniqueName: \"kubernetes.io/projected/945c6471-74be-43a0-b1cf-7d084e5fa394-kube-api-access-xkn9d\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.576469 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-run-openvswitch\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.576485 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/945c6471-74be-43a0-b1cf-7d084e5fa394-ovn-node-metrics-cert\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.576526 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-systemd-units\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.576545 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/945c6471-74be-43a0-b1cf-7d084e5fa394-ovnkube-script-lib\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.576569 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-run-systemd\") pod \"ovnkube-node-vzlgg\" (UID: 
\"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.576587 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-var-lib-openvswitch\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.576602 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-run-ovn\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.576618 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-cni-netd\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.576640 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-node-log\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.576657 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-cni-bin\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.576673 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-log-socket\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.576689 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-etc-openvswitch\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.577405 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-run-netns\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.577430 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/945c6471-74be-43a0-b1cf-7d084e5fa394-ovnkube-config\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc 
kubenswrapper[4961]: I0120 11:04:31.577455 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.577471 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/945c6471-74be-43a0-b1cf-7d084e5fa394-env-overrides\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.577468 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-cni-netd\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.577573 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-run-systemd\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.577603 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-var-lib-openvswitch\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.577631 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-run-ovn\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.577911 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/945c6471-74be-43a0-b1cf-7d084e5fa394-ovnkube-script-lib\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.577991 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/945c6471-74be-43a0-b1cf-7d084e5fa394-env-overrides\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.577999 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-systemd-units\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.578017 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-kubelet\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.578052 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-slash\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.578051 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-run-openvswitch\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.578098 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-node-log\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.578107 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-run-ovn-kubernetes\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.578125 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-etc-openvswitch\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.578150 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-cni-bin\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.578175 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-log-socket\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.578205 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.578242 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/945c6471-74be-43a0-b1cf-7d084e5fa394-host-run-netns\") pod 
\"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.578676 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/945c6471-74be-43a0-b1cf-7d084e5fa394-ovnkube-config\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.587588 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/945c6471-74be-43a0-b1cf-7d084e5fa394-ovn-node-metrics-cert\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.606399 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkn9d\" (UniqueName: \"kubernetes.io/projected/945c6471-74be-43a0-b1cf-7d084e5fa394-kube-api-access-xkn9d\") pod \"ovnkube-node-vzlgg\" (UID: \"945c6471-74be-43a0-b1cf-7d084e5fa394\") " pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.615281 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" Jan 20 11:04:31 crc kubenswrapper[4961]: W0120 11:04:31.648685 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd428a8c3_729d_4c3a_b1f6_253dbec8c153.slice/crio-518ed177c90385d890b16425727042f7a4eee35d08e8aff1e75a87ada38e3d56 WatchSource:0}: Error finding container 518ed177c90385d890b16425727042f7a4eee35d08e8aff1e75a87ada38e3d56: Status 404 returned error can't find the container with id 518ed177c90385d890b16425727042f7a4eee35d08e8aff1e75a87ada38e3d56 Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.658693 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.695807 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crx8m" event={"ID":"9bb5f712-81d8-4313-943a-3acd4dfaf25c","Type":"ContainerStarted","Data":"65e94dbd2064bee825dcfb2bda70f4deaf0cd8060e7eb1c45174efb67cd623ea"} Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.701506 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" event={"ID":"8a5754ab-8fe3-41b8-b760-b3d154e89ba8","Type":"ContainerStarted","Data":"d228363ee7813f6817a3ea54e9502a9b7a05a20d617b36bfc426848d83d5a2e4"} Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.707759 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ws595" event={"ID":"f11b0f95-0c1e-4985-8743-fe9dceef8734","Type":"ContainerStarted","Data":"365d1502e45af9ebe958d8e5774ad888f2fc611dc8b5a03a612c36a74901e411"} Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.707821 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ws595" event={"ID":"f11b0f95-0c1e-4985-8743-fe9dceef8734","Type":"ContainerStarted","Data":"70077458f24a08e366349ee28fc186bc94ed7f6c1f20c1d260e2a618fe2d3a38"} Jan 20 11:04:31 crc kubenswrapper[4961]: W0120 11:04:31.708680 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod945c6471_74be_43a0_b1cf_7d084e5fa394.slice/crio-4eacf7b42391b6ea67dbaa46c1ca571efcb38e6e18076a3bdb9c1fefd2c9b9b9 WatchSource:0}: Error finding container 4eacf7b42391b6ea67dbaa46c1ca571efcb38e6e18076a3bdb9c1fefd2c9b9b9: Status 404 returned error can't find the container with id 4eacf7b42391b6ea67dbaa46c1ca571efcb38e6e18076a3bdb9c1fefd2c9b9b9 Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.710003 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gtl8q" event={"ID":"1c39ab8c-896f-4793-82ab-b055918eb009","Type":"ContainerStarted","Data":"759e7d2699cf78ce2d175fa8b4e44cc5480f4d74192941f96407477588e59d04"} Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.710030 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gtl8q" event={"ID":"1c39ab8c-896f-4793-82ab-b055918eb009","Type":"ContainerStarted","Data":"c86de8640a7dcb245dcd63cd8307fd870776de457145524820a4afcd0f04a2dd"} Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.711475 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" event={"ID":"d428a8c3-729d-4c3a-b1f6-253dbec8c153","Type":"ContainerStarted","Data":"518ed177c90385d890b16425727042f7a4eee35d08e8aff1e75a87ada38e3d56"} Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.718924 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2rldv" event={"ID":"c0d1a495-120d-4998-a1c4-f2d5080cd724","Type":"ContainerStarted","Data":"b8d28e1adab17c64b03e9852a93d9250cc72e3093d8fb64d3ca7c792d5f9d1e1"} Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.718968 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2rldv" event={"ID":"c0d1a495-120d-4998-a1c4-f2d5080cd724","Type":"ContainerStarted","Data":"0f9c280f297b5f944e2a478458c45ef63d05859fd418b52f6dc19bb1802a4050"} Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.834876 4961 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2rldv" podStartSLOduration=1.8348522200000001 podStartE2EDuration="1.83485222s" podCreationTimestamp="2026-01-20 11:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:31.832423802 +0000 UTC m=+24.616923683" watchObservedRunningTime="2026-01-20 11:04:31.83485222 +0000 UTC m=+24.619352081" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.835988 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ws595" podStartSLOduration=0.835982097 podStartE2EDuration="835.982097ms" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:31.791035422 +0000 UTC m=+24.575535293" watchObservedRunningTime="2026-01-20 11:04:31.835982097 +0000 UTC m=+24.620481978" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.862804 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gtl8q" podStartSLOduration=1.862778681 podStartE2EDuration="1.862778681s" podCreationTimestamp="2026-01-20 11:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:31.852366445 +0000 UTC m=+24.636866316" watchObservedRunningTime="2026-01-20 11:04:31.862778681 +0000 UTC m=+24.647278552" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.919879 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw"] Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.920546 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.922758 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.922979 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.937345 4961 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-20 10:59:30 +0000 UTC, rotation deadline is 2026-10-28 07:55:21.057040566 +0000 UTC Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.937441 4961 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6740h50m49.119602697s for next certificate rotation Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.982171 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs\") pod \"network-metrics-daemon-cpvtl\" (UID: \"f4c616c0-5852-4a0c-98e7-7d6af398ed2e\") " pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.982226 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm2gz\" (UniqueName: \"kubernetes.io/projected/53e5c2fa-e067-4a84-8bfb-936b98372d6b-kube-api-access-rm2gz\") pod \"ovnkube-control-plane-749d76644c-hfssw\" (UID: \"53e5c2fa-e067-4a84-8bfb-936b98372d6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.982258 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/53e5c2fa-e067-4a84-8bfb-936b98372d6b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hfssw\" (UID: \"53e5c2fa-e067-4a84-8bfb-936b98372d6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.982320 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/53e5c2fa-e067-4a84-8bfb-936b98372d6b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hfssw\" (UID: \"53e5c2fa-e067-4a84-8bfb-936b98372d6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" Jan 20 11:04:31 crc kubenswrapper[4961]: I0120 11:04:31.982351 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/53e5c2fa-e067-4a84-8bfb-936b98372d6b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hfssw\" (UID: \"53e5c2fa-e067-4a84-8bfb-936b98372d6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" Jan 20 11:04:31 crc kubenswrapper[4961]: E0120 11:04:31.982366 4961 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 11:04:31 crc kubenswrapper[4961]: E0120 11:04:31.982447 4961 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs podName:f4c616c0-5852-4a0c-98e7-7d6af398ed2e nodeName:}" failed. No retries permitted until 2026-01-20 11:04:32.982425764 +0000 UTC m=+25.766925635 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs") pod "network-metrics-daemon-cpvtl" (UID: "f4c616c0-5852-4a0c-98e7-7d6af398ed2e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.083192 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/53e5c2fa-e067-4a84-8bfb-936b98372d6b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hfssw\" (UID: \"53e5c2fa-e067-4a84-8bfb-936b98372d6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.083267 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm2gz\" (UniqueName: \"kubernetes.io/projected/53e5c2fa-e067-4a84-8bfb-936b98372d6b-kube-api-access-rm2gz\") pod \"ovnkube-control-plane-749d76644c-hfssw\" (UID: \"53e5c2fa-e067-4a84-8bfb-936b98372d6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.083300 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/53e5c2fa-e067-4a84-8bfb-936b98372d6b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hfssw\" (UID: \"53e5c2fa-e067-4a84-8bfb-936b98372d6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.083365 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/53e5c2fa-e067-4a84-8bfb-936b98372d6b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hfssw\" (UID: \"53e5c2fa-e067-4a84-8bfb-936b98372d6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.083858 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/53e5c2fa-e067-4a84-8bfb-936b98372d6b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hfssw\" (UID: \"53e5c2fa-e067-4a84-8bfb-936b98372d6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.084049 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/53e5c2fa-e067-4a84-8bfb-936b98372d6b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hfssw\" (UID: \"53e5c2fa-e067-4a84-8bfb-936b98372d6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.087848 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/53e5c2fa-e067-4a84-8bfb-936b98372d6b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hfssw\" (UID: \"53e5c2fa-e067-4a84-8bfb-936b98372d6b\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.099607 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm2gz\" (UniqueName: \"kubernetes.io/projected/53e5c2fa-e067-4a84-8bfb-936b98372d6b-kube-api-access-rm2gz\") pod \"ovnkube-control-plane-749d76644c-hfssw\" (UID: \"53e5c2fa-e067-4a84-8bfb-936b98372d6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.261886 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.265942 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.267177 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.270852 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 20 11:04:32 crc kubenswrapper[4961]: W0120 11:04:32.283100 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53e5c2fa_e067_4a84_8bfb_936b98372d6b.slice/crio-bf277015e35fedd278d9538f5a466439c237ee5a6c6b3c3ce025f362fd06f4ab WatchSource:0}: Error finding container bf277015e35fedd278d9538f5a466439c237ee5a6c6b3c3ce025f362fd06f4ab: Status 404 returned error can't find the container with id bf277015e35fedd278d9538f5a466439c237ee5a6c6b3c3ce025f362fd06f4ab Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.538851 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:32 crc kubenswrapper[4961]: E0120 11:04:32.539028 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cpvtl" podUID="f4c616c0-5852-4a0c-98e7-7d6af398ed2e" Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.723713 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" event={"ID":"d428a8c3-729d-4c3a-b1f6-253dbec8c153","Type":"ContainerStarted","Data":"65ba126d6a242c62ddcdf444352af8c1b41e43d5a3ddc63aef92eb1a3e8a9af4"} Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.726388 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" event={"ID":"53e5c2fa-e067-4a84-8bfb-936b98372d6b","Type":"ContainerStarted","Data":"289c39ade59a2914981d467cd4e206400b892927dc2654e50e90d58ebafe0b9e"} Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.726440 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" event={"ID":"53e5c2fa-e067-4a84-8bfb-936b98372d6b","Type":"ContainerStarted","Data":"bf277015e35fedd278d9538f5a466439c237ee5a6c6b3c3ce025f362fd06f4ab"} Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.727747 4961 generic.go:334] "Generic (PLEG): container finished" podID="9bb5f712-81d8-4313-943a-3acd4dfaf25c" containerID="603728297ffc10b0ea39a02ee298db91cde8fe20751430d5d4ee902771ba7198" exitCode=0 Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.727782 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crx8m" event={"ID":"9bb5f712-81d8-4313-943a-3acd4dfaf25c","Type":"ContainerDied","Data":"603728297ffc10b0ea39a02ee298db91cde8fe20751430d5d4ee902771ba7198"} Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.730415 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" event={"ID":"8a5754ab-8fe3-41b8-b760-b3d154e89ba8","Type":"ContainerStarted","Data":"48e0c2d092fe72226112a3363bfceac325fba74d3fbfeec3ae76153770e2cc67"} Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.730448 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" event={"ID":"8a5754ab-8fe3-41b8-b760-b3d154e89ba8","Type":"ContainerStarted","Data":"6f067c6d9f779591467594bcff24d07919bbe280c82d7bc657faa215a6e63cdd"} Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.733291 4961 generic.go:334] "Generic (PLEG): container finished" podID="945c6471-74be-43a0-b1cf-7d084e5fa394" containerID="adb5d166e31a15f9e77c09513836dd27b111139c5d521c7942b9ace9d2658b2b" exitCode=0 Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.733681 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" event={"ID":"945c6471-74be-43a0-b1cf-7d084e5fa394","Type":"ContainerDied","Data":"adb5d166e31a15f9e77c09513836dd27b111139c5d521c7942b9ace9d2658b2b"} Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.733719 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" event={"ID":"945c6471-74be-43a0-b1cf-7d084e5fa394","Type":"ContainerStarted","Data":"4eacf7b42391b6ea67dbaa46c1ca571efcb38e6e18076a3bdb9c1fefd2c9b9b9"} Jan 20 11:04:32 crc kubenswrapper[4961]: E0120 11:04:32.742009 4961 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:04:32 crc 
kubenswrapper[4961]: I0120 11:04:32.743292 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-65x6v" podStartSLOduration=1.743268359 podStartE2EDuration="1.743268359s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:32.742848869 +0000 UTC m=+25.527348740" watchObservedRunningTime="2026-01-20 11:04:32.743268359 +0000 UTC m=+25.527768230" Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.760680 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=0.760658811 podStartE2EDuration="760.658811ms" podCreationTimestamp="2026-01-20 11:04:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:32.759516074 +0000 UTC m=+25.544015945" watchObservedRunningTime="2026-01-20 11:04:32.760658811 +0000 UTC m=+25.545158682" Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.843911 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" podStartSLOduration=1.8438816120000001 podStartE2EDuration="1.843881612s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:32.843185725 +0000 UTC m=+25.627685616" watchObservedRunningTime="2026-01-20 11:04:32.843881612 +0000 UTC m=+25.628381493" Jan 20 11:04:32 crc kubenswrapper[4961]: I0120 11:04:32.993956 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs\") pod \"network-metrics-daemon-cpvtl\" (UID: \"f4c616c0-5852-4a0c-98e7-7d6af398ed2e\") " pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:32 crc kubenswrapper[4961]: E0120 11:04:32.994151 4961 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 11:04:32 crc kubenswrapper[4961]: E0120 11:04:32.994514 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs podName:f4c616c0-5852-4a0c-98e7-7d6af398ed2e nodeName:}" failed. No retries permitted until 2026-01-20 11:04:34.994484628 +0000 UTC m=+27.778984519 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs") pod "network-metrics-daemon-cpvtl" (UID: "f4c616c0-5852-4a0c-98e7-7d6af398ed2e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 11:04:33 crc kubenswrapper[4961]: I0120 11:04:33.195980 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:33 crc kubenswrapper[4961]: E0120 11:04:33.196227 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:41.196189514 +0000 UTC m=+33.980689415 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:33 crc kubenswrapper[4961]: I0120 11:04:33.296926 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:33 crc kubenswrapper[4961]: I0120 11:04:33.297121 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:33 crc kubenswrapper[4961]: E0120 11:04:33.297127 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 11:04:33 crc kubenswrapper[4961]: E0120 11:04:33.297154 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 11:04:33 crc kubenswrapper[4961]: E0120 11:04:33.297167 4961 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:33 crc kubenswrapper[4961]: I0120 11:04:33.297183 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:33 crc kubenswrapper[4961]: E0120 11:04:33.297224 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:41.297207895 +0000 UTC m=+34.081707766 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:33 crc kubenswrapper[4961]: I0120 11:04:33.297253 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:33 crc kubenswrapper[4961]: E0120 11:04:33.297293 4961 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 11:04:33 crc kubenswrapper[4961]: E0120 11:04:33.297333 4961 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 11:04:33 crc kubenswrapper[4961]: E0120 11:04:33.297352 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 11:04:33 crc kubenswrapper[4961]: E0120 11:04:33.297491 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 11:04:33 crc kubenswrapper[4961]: E0120 11:04:33.297506 4961 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:33 crc kubenswrapper[4961]: E0120 11:04:33.297358 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:41.297339959 +0000 UTC m=+34.081839900 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 11:04:33 crc kubenswrapper[4961]: E0120 11:04:33.297565 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:41.297541833 +0000 UTC m=+34.082041704 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 11:04:33 crc kubenswrapper[4961]: E0120 11:04:33.297581 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:41.297571654 +0000 UTC m=+34.082071525 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:33 crc kubenswrapper[4961]: I0120 11:04:33.537898 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:33 crc kubenswrapper[4961]: I0120 11:04:33.537993 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:33 crc kubenswrapper[4961]: E0120 11:04:33.538009 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 11:04:33 crc kubenswrapper[4961]: I0120 11:04:33.538093 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:33 crc kubenswrapper[4961]: E0120 11:04:33.538147 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 11:04:33 crc kubenswrapper[4961]: E0120 11:04:33.538256 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 11:04:33 crc kubenswrapper[4961]: I0120 11:04:33.739047 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" event={"ID":"53e5c2fa-e067-4a84-8bfb-936b98372d6b","Type":"ContainerStarted","Data":"524a106e6f6bef3e9ce869cc53afd40dc9b257fadc5f2b68a12523f38704defb"} Jan 20 11:04:33 crc kubenswrapper[4961]: I0120 11:04:33.742616 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crx8m" event={"ID":"9bb5f712-81d8-4313-943a-3acd4dfaf25c","Type":"ContainerStarted","Data":"f4c81e03d17771f68eca42e412d00fc491d20ad022526fc1de26b82a56d727aa"} Jan 20 11:04:33 crc kubenswrapper[4961]: I0120 11:04:33.748219 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" event={"ID":"945c6471-74be-43a0-b1cf-7d084e5fa394","Type":"ContainerStarted","Data":"3a03bc8667f7fca2d671c259a31f09bd869548adbdf79e0035700d67aaeb68e7"} Jan 20 11:04:33 crc kubenswrapper[4961]: I0120 11:04:33.748285 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" event={"ID":"945c6471-74be-43a0-b1cf-7d084e5fa394","Type":"ContainerStarted","Data":"a73ce561045cef8ea8f3d0279066ae342d103bf305022e7fb20d02a795035bfa"} Jan 20 11:04:33 crc kubenswrapper[4961]: I0120 11:04:33.748300 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" event={"ID":"945c6471-74be-43a0-b1cf-7d084e5fa394","Type":"ContainerStarted","Data":"397078538c7ab3c0e5f8a4a4e96655e8a93f52cc40faedef60d84843c5a27357"} Jan 20 11:04:33 crc kubenswrapper[4961]: I0120 11:04:33.748312 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" event={"ID":"945c6471-74be-43a0-b1cf-7d084e5fa394","Type":"ContainerStarted","Data":"4eb544d8838b57b84b9699b31db256c87890e74430fb465939f2e6d6ccb07ec9"} Jan 20 11:04:33 crc kubenswrapper[4961]: I0120 11:04:33.748325 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" event={"ID":"945c6471-74be-43a0-b1cf-7d084e5fa394","Type":"ContainerStarted","Data":"cdfcbfcacfd19f4a21d1d4a654385d8315a289dd20bf6ebf0ad986ea7825656e"} Jan 20 11:04:33 crc kubenswrapper[4961]: I0120 11:04:33.748335 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" event={"ID":"945c6471-74be-43a0-b1cf-7d084e5fa394","Type":"ContainerStarted","Data":"137a4b0d75e60a6a4c583561721f6fae663c7362350a99a26050e93acd4dd9be"} Jan 20 11:04:33 crc kubenswrapper[4961]: I0120 11:04:33.755614 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hfssw" podStartSLOduration=2.755591808 podStartE2EDuration="2.755591808s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:33.754628285 +0000 UTC m=+26.539128156" watchObservedRunningTime="2026-01-20 11:04:33.755591808 +0000 UTC m=+26.540091679" Jan 20 11:04:34 crc kubenswrapper[4961]: I0120 11:04:34.538917 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:34 crc kubenswrapper[4961]: E0120 11:04:34.539115 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cpvtl" podUID="f4c616c0-5852-4a0c-98e7-7d6af398ed2e" Jan 20 11:04:34 crc kubenswrapper[4961]: I0120 11:04:34.753235 4961 generic.go:334] "Generic (PLEG): container finished" podID="9bb5f712-81d8-4313-943a-3acd4dfaf25c" containerID="f4c81e03d17771f68eca42e412d00fc491d20ad022526fc1de26b82a56d727aa" exitCode=0 Jan 20 11:04:34 crc kubenswrapper[4961]: I0120 11:04:34.754003 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crx8m" event={"ID":"9bb5f712-81d8-4313-943a-3acd4dfaf25c","Type":"ContainerDied","Data":"f4c81e03d17771f68eca42e412d00fc491d20ad022526fc1de26b82a56d727aa"} Jan 20 11:04:35 crc kubenswrapper[4961]: I0120 11:04:35.017168 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs\") pod \"network-metrics-daemon-cpvtl\" (UID: \"f4c616c0-5852-4a0c-98e7-7d6af398ed2e\") " pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:35 crc kubenswrapper[4961]: E0120 11:04:35.017364 4961 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 11:04:35 crc kubenswrapper[4961]: E0120 11:04:35.017627 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs podName:f4c616c0-5852-4a0c-98e7-7d6af398ed2e nodeName:}" failed. No retries permitted until 2026-01-20 11:04:39.01760733 +0000 UTC m=+31.802107221 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs") pod "network-metrics-daemon-cpvtl" (UID: "f4c616c0-5852-4a0c-98e7-7d6af398ed2e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 11:04:35 crc kubenswrapper[4961]: I0120 11:04:35.538263 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:35 crc kubenswrapper[4961]: I0120 11:04:35.538292 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:35 crc kubenswrapper[4961]: E0120 11:04:35.538431 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 11:04:35 crc kubenswrapper[4961]: I0120 11:04:35.538262 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:35 crc kubenswrapper[4961]: E0120 11:04:35.538650 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 11:04:35 crc kubenswrapper[4961]: E0120 11:04:35.538778 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 11:04:35 crc kubenswrapper[4961]: I0120 11:04:35.760354 4961 generic.go:334] "Generic (PLEG): container finished" podID="9bb5f712-81d8-4313-943a-3acd4dfaf25c" containerID="1c64e5b52850c6c8c22be60cdc78fdcdebf2a50ae2720bb4941d40db29faf70c" exitCode=0 Jan 20 11:04:35 crc kubenswrapper[4961]: I0120 11:04:35.760419 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crx8m" event={"ID":"9bb5f712-81d8-4313-943a-3acd4dfaf25c","Type":"ContainerDied","Data":"1c64e5b52850c6c8c22be60cdc78fdcdebf2a50ae2720bb4941d40db29faf70c"} Jan 20 11:04:36 crc kubenswrapper[4961]: I0120 11:04:36.538712 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:36 crc kubenswrapper[4961]: E0120 11:04:36.538914 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cpvtl" podUID="f4c616c0-5852-4a0c-98e7-7d6af398ed2e" Jan 20 11:04:36 crc kubenswrapper[4961]: I0120 11:04:36.797908 4961 generic.go:334] "Generic (PLEG): container finished" podID="9bb5f712-81d8-4313-943a-3acd4dfaf25c" containerID="1ef26c0b558c18e5572a5e6f84c6b0f79e1c372cc77ae1ca4b9886ae16a58fff" exitCode=0 Jan 20 11:04:36 crc kubenswrapper[4961]: I0120 11:04:36.798002 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crx8m" event={"ID":"9bb5f712-81d8-4313-943a-3acd4dfaf25c","Type":"ContainerDied","Data":"1ef26c0b558c18e5572a5e6f84c6b0f79e1c372cc77ae1ca4b9886ae16a58fff"} Jan 20 11:04:36 crc kubenswrapper[4961]: I0120 11:04:36.811733 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" event={"ID":"945c6471-74be-43a0-b1cf-7d084e5fa394","Type":"ContainerStarted","Data":"9249da1f954c059a05bd4a38d9914e5b7610512c03e54845c4530185485b29bd"} Jan 20 11:04:37 crc kubenswrapper[4961]: I0120 11:04:37.370163 4961 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 20 11:04:37 crc kubenswrapper[4961]: I0120 11:04:37.696462 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:37 crc kubenswrapper[4961]: E0120 11:04:37.696629 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 11:04:37 crc kubenswrapper[4961]: I0120 11:04:37.696714 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:37 crc kubenswrapper[4961]: E0120 11:04:37.697164 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 11:04:37 crc kubenswrapper[4961]: I0120 11:04:37.697844 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:37 crc kubenswrapper[4961]: E0120 11:04:37.704342 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 11:04:37 crc kubenswrapper[4961]: I0120 11:04:37.819117 4961 generic.go:334] "Generic (PLEG): container finished" podID="9bb5f712-81d8-4313-943a-3acd4dfaf25c" containerID="6932e061cab4fc055ab8f9b9d501c2c2bed84e244699aeace3a3379f39fc566d" exitCode=0 Jan 20 11:04:37 crc kubenswrapper[4961]: I0120 11:04:37.819175 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crx8m" event={"ID":"9bb5f712-81d8-4313-943a-3acd4dfaf25c","Type":"ContainerDied","Data":"6932e061cab4fc055ab8f9b9d501c2c2bed84e244699aeace3a3379f39fc566d"} Jan 20 11:04:38 crc kubenswrapper[4961]: I0120 11:04:38.538023 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:38 crc kubenswrapper[4961]: E0120 11:04:38.538523 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cpvtl" podUID="f4c616c0-5852-4a0c-98e7-7d6af398ed2e" Jan 20 11:04:38 crc kubenswrapper[4961]: I0120 11:04:38.825864 4961 generic.go:334] "Generic (PLEG): container finished" podID="9bb5f712-81d8-4313-943a-3acd4dfaf25c" containerID="de27ca5e41b33ef026651a6e65cd7a01170a937b431b1f0c5774f97236c988d6" exitCode=0 Jan 20 11:04:38 crc kubenswrapper[4961]: I0120 11:04:38.825952 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crx8m" event={"ID":"9bb5f712-81d8-4313-943a-3acd4dfaf25c","Type":"ContainerDied","Data":"de27ca5e41b33ef026651a6e65cd7a01170a937b431b1f0c5774f97236c988d6"} Jan 20 11:04:38 crc kubenswrapper[4961]: I0120 11:04:38.832894 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" event={"ID":"945c6471-74be-43a0-b1cf-7d084e5fa394","Type":"ContainerStarted","Data":"ccfba95f87de447e740fbe4c24833c71b89b8ee061efd48b6dbbdf1324340c38"} Jan 20 11:04:38 crc kubenswrapper[4961]: I0120 11:04:38.833231 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:38 crc kubenswrapper[4961]: I0120 11:04:38.833274 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:38 crc kubenswrapper[4961]: I0120 11:04:38.866559 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:38 crc kubenswrapper[4961]: I0120 11:04:38.905089 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" podStartSLOduration=7.905044986 podStartE2EDuration="7.905044986s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:38.875760223 +0000 UTC m=+31.660260114" watchObservedRunningTime="2026-01-20 11:04:38.905044986 +0000 UTC m=+31.689544867" Jan 20 11:04:39 crc kubenswrapper[4961]: I0120 11:04:39.113365 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs\") pod \"network-metrics-daemon-cpvtl\" (UID: \"f4c616c0-5852-4a0c-98e7-7d6af398ed2e\") " pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:39 crc kubenswrapper[4961]: E0120 11:04:39.113613 4961 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 11:04:39 crc kubenswrapper[4961]: E0120 11:04:39.113716 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs podName:f4c616c0-5852-4a0c-98e7-7d6af398ed2e nodeName:}" failed. No retries permitted until 2026-01-20 11:04:47.113687387 +0000 UTC m=+39.898187308 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs") pod "network-metrics-daemon-cpvtl" (UID: "f4c616c0-5852-4a0c-98e7-7d6af398ed2e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 11:04:39 crc kubenswrapper[4961]: I0120 11:04:39.538902 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:39 crc kubenswrapper[4961]: I0120 11:04:39.538902 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:39 crc kubenswrapper[4961]: I0120 11:04:39.539049 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:39 crc kubenswrapper[4961]: E0120 11:04:39.539236 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 11:04:39 crc kubenswrapper[4961]: E0120 11:04:39.539370 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 11:04:39 crc kubenswrapper[4961]: E0120 11:04:39.539470 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 11:04:39 crc kubenswrapper[4961]: I0120 11:04:39.843938 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-crx8m" event={"ID":"9bb5f712-81d8-4313-943a-3acd4dfaf25c","Type":"ContainerStarted","Data":"5bfb9a7613d1f25da3895fb7248d37f9380ab0d9ebd91d0d7bb5be972776e69a"} Jan 20 11:04:39 crc kubenswrapper[4961]: I0120 11:04:39.844636 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:39 crc kubenswrapper[4961]: I0120 11:04:39.876117 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:04:39 crc kubenswrapper[4961]: I0120 11:04:39.916986 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-crx8m" podStartSLOduration=8.916951786 podStartE2EDuration="8.916951786s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:39.867101196 +0000 UTC m=+32.651601077" watchObservedRunningTime="2026-01-20 11:04:39.916951786 +0000 UTC m=+32.701451697" Jan 20 11:04:40 crc kubenswrapper[4961]: I0120 11:04:40.473172 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cpvtl"] Jan 20 11:04:40 crc kubenswrapper[4961]: I0120 11:04:40.473319 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:40 crc kubenswrapper[4961]: E0120 11:04:40.473423 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cpvtl" podUID="f4c616c0-5852-4a0c-98e7-7d6af398ed2e" Jan 20 11:04:40 crc kubenswrapper[4961]: I0120 11:04:40.509198 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:04:41 crc kubenswrapper[4961]: I0120 11:04:41.242307 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:41 crc kubenswrapper[4961]: E0120 11:04:41.242568 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:57.242531882 +0000 UTC m=+50.027031753 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:41 crc kubenswrapper[4961]: I0120 11:04:41.344030 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:41 crc kubenswrapper[4961]: I0120 11:04:41.344133 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:41 crc kubenswrapper[4961]: I0120 11:04:41.344180 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:41 crc kubenswrapper[4961]: I0120 11:04:41.344240 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:41 crc kubenswrapper[4961]: E0120 11:04:41.344259 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 11:04:41 crc kubenswrapper[4961]: E0120 11:04:41.344294 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 11:04:41 crc kubenswrapper[4961]: E0120 11:04:41.344312 4961 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:41 crc kubenswrapper[4961]: E0120 11:04:41.344347 4961 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 11:04:41 crc kubenswrapper[4961]: E0120 11:04:41.344378 4961 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 
11:04:41 crc kubenswrapper[4961]: E0120 11:04:41.344392 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:57.344370544 +0000 UTC m=+50.128870425 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:41 crc kubenswrapper[4961]: E0120 11:04:41.344447 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:57.344433295 +0000 UTC m=+50.128933176 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 11:04:41 crc kubenswrapper[4961]: E0120 11:04:41.344467 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:57.344459596 +0000 UTC m=+50.128959477 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 11:04:41 crc kubenswrapper[4961]: E0120 11:04:41.344539 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 11:04:41 crc kubenswrapper[4961]: E0120 11:04:41.344552 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 11:04:41 crc kubenswrapper[4961]: E0120 11:04:41.344563 4961 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:41 crc kubenswrapper[4961]: E0120 11:04:41.344609 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 11:04:57.344599979 +0000 UTC m=+50.129099860 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 11:04:41 crc kubenswrapper[4961]: I0120 11:04:41.538666 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:41 crc kubenswrapper[4961]: E0120 11:04:41.538833 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 11:04:41 crc kubenswrapper[4961]: I0120 11:04:41.538911 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:41 crc kubenswrapper[4961]: E0120 11:04:41.539027 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 11:04:41 crc kubenswrapper[4961]: I0120 11:04:41.538667 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:41 crc kubenswrapper[4961]: E0120 11:04:41.539180 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 11:04:42 crc kubenswrapper[4961]: I0120 11:04:42.537913 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:42 crc kubenswrapper[4961]: E0120 11:04:42.538122 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cpvtl" podUID="f4c616c0-5852-4a0c-98e7-7d6af398ed2e" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.468011 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.468180 4961 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.504221 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.505254 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.512319 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.514345 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.514505 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.514364 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.514700 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-7szjl"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.515195 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.515361 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-7szjl" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.515427 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.516026 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.518040 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.523947 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.524598 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.524985 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.525371 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.531029 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fk28r"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.531585 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dj2tn"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.531990 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.532174 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.533680 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.535731 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjmbx"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.536479 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.536526 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjmbx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.536689 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.536511 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.537708 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.538135 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.539220 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.539784 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.540215 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.540474 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.540634 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.540788 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.540958 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.541274 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.541377 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.541616 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.541767 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.541907 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.541954 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.542108 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.555407 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.556212 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.561130 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.562924 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.584596 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.585273 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.585710 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tz72b"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.585982 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nszp2"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.586211 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-tz72b" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.586450 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7r52t"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.586569 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.586765 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.587285 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2vqvf"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.588398 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2vqvf" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.588550 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bwhk5"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.588956 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bwhk5" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.589316 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0a10d3-171e-4695-b66e-870bb63e5712-serving-cert\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.589425 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9f0a10d3-171e-4695-b66e-870bb63e5712-audit-dir\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.589536 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9f0a10d3-171e-4695-b66e-870bb63e5712-encryption-config\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.589626 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9f0a10d3-171e-4695-b66e-870bb63e5712-audit-policies\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.589687 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9f0a10d3-171e-4695-b66e-870bb63e5712-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.589747 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x24lw\" (UniqueName: \"kubernetes.io/projected/9f0a10d3-171e-4695-b66e-870bb63e5712-kube-api-access-x24lw\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.589809 4961 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f0a10d3-171e-4695-b66e-870bb63e5712-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.589887 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f0a10d3-171e-4695-b66e-870bb63e5712-etcd-client\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.590010 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.593326 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-v7tsh"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.594243 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-v7tsh" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.594281 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.594815 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.602623 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-t464w"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.603026 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.603391 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.603731 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-t464w" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.611818 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.612178 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.612295 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.612401 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.612515 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.612692 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.612830 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.612929 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.612979 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.613111 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.613160 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.613473 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.613609 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.613686 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.613691 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.613654 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.613619 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.614205 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.614329 4961 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.615129 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.615433 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.615606 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.615762 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.615907 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.616088 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.616680 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.616801 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.616913 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.617011 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.617132 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.617028 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.617282 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.617451 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.617609 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.617724 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.618987 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.619099 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.619331 4961 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.619759 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.621343 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.626178 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-l62cr"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.626904 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dhf59"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.627403 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-l62cr" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.627966 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dhf59" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.637384 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9xqq"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.638701 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.639656 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9xqq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.640122 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.644734 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jjg29"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.646017 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.651504 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.651602 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.651512 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.660460 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kdxfq"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.661214 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kdxfq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.661852 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.662174 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fw9fs"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.662974 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.664096 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.673507 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.673816 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.673999 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.674373 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.674405 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.674481 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.674654 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.674816 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.675043 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.675197 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.675301 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.676478 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.676487 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.676670 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 
20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.676947 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.677018 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.677074 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.677150 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.677693 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.677794 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.678000 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.678156 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.678201 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.682233 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.686721 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.687356 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tk9p"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.687814 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tk9p" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.688012 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.688297 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-jq4wg"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.689081 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.690079 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fl5hs"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.690746 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.691134 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fl5hs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.691879 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.692461 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.692880 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-2cs6z"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.693838 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbd861e-1c54-4ec8-beca-021062bf2924-config\") pod \"authentication-operator-69f744f599-fk28r\" (UID: \"9cbd861e-1c54-4ec8-beca-021062bf2924\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.693891 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpzzq\" (UniqueName: \"kubernetes.io/projected/154d567d-eccf-4771-a5e0-60b2375d3e8b-kube-api-access-xpzzq\") pod \"machine-api-operator-5694c8668f-fw9fs\" (UID: \"154d567d-eccf-4771-a5e0-60b2375d3e8b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.693921 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cbd861e-1c54-4ec8-beca-021062bf2924-serving-cert\") pod \"authentication-operator-69f744f599-fk28r\" (UID: \"9cbd861e-1c54-4ec8-beca-021062bf2924\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.693962 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-config\") pod \"controller-manager-879f6c89f-dj2tn\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.694044 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-serving-cert\") pod \"controller-manager-879f6c89f-dj2tn\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.694313 4961 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9f0a10d3-171e-4695-b66e-870bb63e5712-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.694965 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9f0a10d3-171e-4695-b66e-870bb63e5712-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.695030 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.695140 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/154d567d-eccf-4771-a5e0-60b2375d3e8b-images\") pod \"machine-api-operator-5694c8668f-fw9fs\" (UID: \"154d567d-eccf-4771-a5e0-60b2375d3e8b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.695216 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0a10d3-171e-4695-b66e-870bb63e5712-serving-cert\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.695290 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9f0a10d3-171e-4695-b66e-870bb63e5712-audit-dir\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.695313 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-client-ca\") pod \"controller-manager-879f6c89f-dj2tn\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.695791 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9f0a10d3-171e-4695-b66e-870bb63e5712-audit-dir\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.695972 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g84nh\" (UniqueName: \"kubernetes.io/projected/9cbd861e-1c54-4ec8-beca-021062bf2924-kube-api-access-g84nh\") pod \"authentication-operator-69f744f599-fk28r\" (UID: \"9cbd861e-1c54-4ec8-beca-021062bf2924\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.696182 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.696575 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9f0a10d3-171e-4695-b66e-870bb63e5712-encryption-config\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.697024 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/154d567d-eccf-4771-a5e0-60b2375d3e8b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fw9fs\" (UID: \"154d567d-eccf-4771-a5e0-60b2375d3e8b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.697091 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cbd861e-1c54-4ec8-beca-021062bf2924-service-ca-bundle\") pod \"authentication-operator-69f744f599-fk28r\" (UID: \"9cbd861e-1c54-4ec8-beca-021062bf2924\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.697124 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f99ff\" (UniqueName: \"kubernetes.io/projected/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-kube-api-access-f99ff\") pod \"controller-manager-879f6c89f-dj2tn\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.697227 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x24lw\" (UniqueName: \"kubernetes.io/projected/9f0a10d3-171e-4695-b66e-870bb63e5712-kube-api-access-x24lw\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.697276 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dj2tn\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.697310 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9f0a10d3-171e-4695-b66e-870bb63e5712-audit-policies\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.697332 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f0a10d3-171e-4695-b66e-870bb63e5712-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: 
I0120 11:04:43.697354 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfmx7\" (UniqueName: \"kubernetes.io/projected/cf90e4e1-8fba-4df9-8e30-b392921d4d16-kube-api-access-hfmx7\") pod \"downloads-7954f5f757-7szjl\" (UID: \"cf90e4e1-8fba-4df9-8e30-b392921d4d16\") " pod="openshift-console/downloads-7954f5f757-7szjl" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.697378 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kf5l\" (UniqueName: \"kubernetes.io/projected/1022948e-9743-4b7f-9e25-2e2b9070789c-kube-api-access-9kf5l\") pod \"openshift-apiserver-operator-796bbdcf4f-wjmbx\" (UID: \"1022948e-9743-4b7f-9e25-2e2b9070789c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjmbx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.697411 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f0a10d3-171e-4695-b66e-870bb63e5712-etcd-client\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.697455 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1022948e-9743-4b7f-9e25-2e2b9070789c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wjmbx\" (UID: \"1022948e-9743-4b7f-9e25-2e2b9070789c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjmbx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.697485 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1022948e-9743-4b7f-9e25-2e2b9070789c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wjmbx\" (UID: \"1022948e-9743-4b7f-9e25-2e2b9070789c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjmbx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.698170 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f0a10d3-171e-4695-b66e-870bb63e5712-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.698493 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/154d567d-eccf-4771-a5e0-60b2375d3e8b-config\") pod \"machine-api-operator-5694c8668f-fw9fs\" (UID: \"154d567d-eccf-4771-a5e0-60b2375d3e8b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.698507 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9f0a10d3-171e-4695-b66e-870bb63e5712-audit-policies\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.698533 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/9cbd861e-1c54-4ec8-beca-021062bf2924-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fk28r\" (UID: \"9cbd861e-1c54-4ec8-beca-021062bf2924\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.698750 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.699553 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.699840 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.700981 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-z54rs"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.713100 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f0a10d3-171e-4695-b66e-870bb63e5712-serving-cert\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.713221 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.715128 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9f0a10d3-171e-4695-b66e-870bb63e5712-encryption-config\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.713498 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f0a10d3-171e-4695-b66e-870bb63e5712-etcd-client\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.717082 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.718952 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.719624 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bslr7"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.720458 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.720931 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zm7fw"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.721726 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zm7fw" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.722460 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.722903 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.724383 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6lk7p"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.724946 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6lk7p" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.725407 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.725982 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.726859 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-6xx2v"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.727404 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.727628 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.727906 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kgt2j"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.728459 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kgt2j" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.729058 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.730074 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dj2tn"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.730979 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjmbx"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.732034 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fk28r"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.732951 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bwhk5"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.733859 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.734805 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7szjl"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.735872 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7r52t"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.737650 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-l62cr"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.738582 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-t464w"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.739582 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.740573 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.741746 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-x6vpd"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.742530 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-x6vpd" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.742981 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-dghc2"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.743778 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dghc2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.744184 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nszp2"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.745090 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tk9p"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.746110 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.747072 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-cczpb"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.747512 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.747993 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.748625 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-z54rs"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.750781 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tz72b"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.751837 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.753091 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-44twq"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.753841 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-44twq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.754451 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dhf59"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.755254 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fw9fs"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.756435 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2vqvf"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.758130 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-jq4wg"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.759988 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9xqq"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.762490 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-2cs6z"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.766561 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.769715 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.770646 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kdxfq"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.771988 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jjg29"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.773131 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.774279 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xd5gg"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.775991 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.776169 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.776348 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-44twq"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.778627 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fl5hs"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.779625 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bslr7"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.781150 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-v7tsh"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.782138 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6lk7p"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.785851 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zm7fw"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.787155 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-x6vpd"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.788394 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.788822 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.789480 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kgt2j"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.790610 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xd5gg"] Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799427 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a81eef7a-7f29-4f79-863e-c3e009b56ad8-config\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799459 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a74ca54e-cdb6-4759-9bf3-fc4a3defb11c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kdxfq\" (UID: \"a74ca54e-cdb6-4759-9bf3-fc4a3defb11c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kdxfq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799491 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpzzq\" (UniqueName: \"kubernetes.io/projected/154d567d-eccf-4771-a5e0-60b2375d3e8b-kube-api-access-xpzzq\") pod \"machine-api-operator-5694c8668f-fw9fs\" (UID: \"154d567d-eccf-4771-a5e0-60b2375d3e8b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799509 4961 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r2q5\" (UniqueName: \"kubernetes.io/projected/348fca40-a376-4560-a301-81c5d7dc93dd-kube-api-access-4r2q5\") pod \"multus-admission-controller-857f4d67dd-2vqvf\" (UID: \"348fca40-a376-4560-a301-81c5d7dc93dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2vqvf" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799529 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b736ffe7-8ba8-4d20-8831-37d44f8d63de-srv-cert\") pod \"olm-operator-6b444d44fb-5bfjx\" (UID: \"b736ffe7-8ba8-4d20-8831-37d44f8d63de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799549 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-config\") pod \"controller-manager-879f6c89f-dj2tn\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799571 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc451e0b-ed99-4138-8e62-01d91d2c914f-console-serving-cert\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799588 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799605 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28d61801-55c0-4e08-99b9-0a6b3d16fe71-serving-cert\") pod \"etcd-operator-b45778765-jq4wg\" (UID: \"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799622 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82cc64f1-0377-43e9-94a0-213d82b4a415-audit-dir\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799638 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a81eef7a-7f29-4f79-863e-c3e009b56ad8-serving-cert\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799656 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9888\" (UniqueName: 
\"kubernetes.io/projected/9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9-kube-api-access-w9888\") pod \"packageserver-d55dfcdfc-8cwsq\" (UID: \"9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799672 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rxrm\" (UniqueName: \"kubernetes.io/projected/82cc64f1-0377-43e9-94a0-213d82b4a415-kube-api-access-7rxrm\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799691 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51f20be1-6fa3-47ce-ac42-6d9a618ae151-client-ca\") pod \"route-controller-manager-6576b87f9c-86klf\" (UID: \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799709 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/154d567d-eccf-4771-a5e0-60b2375d3e8b-images\") pod \"machine-api-operator-5694c8668f-fw9fs\" (UID: \"154d567d-eccf-4771-a5e0-60b2375d3e8b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799725 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj599\" (UniqueName: \"kubernetes.io/projected/088f8776-75dd-4adc-9647-6a03e411313e-kube-api-access-rj599\") pod \"ingress-operator-5b745b69d9-b58ws\" (UID: \"088f8776-75dd-4adc-9647-6a03e411313e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799742 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9-tmpfs\") pod \"packageserver-d55dfcdfc-8cwsq\" (UID: \"9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799760 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-client-ca\") pod \"controller-manager-879f6c89f-dj2tn\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799776 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc451e0b-ed99-4138-8e62-01d91d2c914f-console-config\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799792 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51f20be1-6fa3-47ce-ac42-6d9a618ae151-config\") pod \"route-controller-manager-6576b87f9c-86klf\" (UID: 
\"51f20be1-6fa3-47ce-ac42-6d9a618ae151\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799809 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fvf8\" (UniqueName: \"kubernetes.io/projected/a74ca54e-cdb6-4759-9bf3-fc4a3defb11c-kube-api-access-9fvf8\") pod \"openshift-controller-manager-operator-756b6f6bc6-kdxfq\" (UID: \"a74ca54e-cdb6-4759-9bf3-fc4a3defb11c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kdxfq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799831 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/154d567d-eccf-4771-a5e0-60b2375d3e8b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fw9fs\" (UID: \"154d567d-eccf-4771-a5e0-60b2375d3e8b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799848 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-default-certificate\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799880 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a-serving-cert\") pod \"openshift-config-operator-7777fb866f-nszp2\" (UID: \"74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799895 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-audit-policies\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799913 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799929 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799947 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4e828266-647a-4cc3-9e4e-4d27d1ddbda1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-m9xqq\" (UID: \"4e828266-647a-4cc3-9e4e-4d27d1ddbda1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9xqq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799965 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs7d2\" (UniqueName: \"kubernetes.io/projected/68d3d73a-7ef8-49ee-ae94-2d73115e126e-kube-api-access-vs7d2\") pod \"cni-sysctl-allowlist-ds-cczpb\" (UID: \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.799988 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2b6ee6-f793-43c1-bde1-92e1d8e67754-config\") pod \"machine-approver-56656f9798-xkbs6\" (UID: \"aa2b6ee6-f793-43c1-bde1-92e1d8e67754\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800014 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbwvx\" (UniqueName: \"kubernetes.io/projected/aa2b6ee6-f793-43c1-bde1-92e1d8e67754-kube-api-access-kbwvx\") pod \"machine-approver-56656f9798-xkbs6\" (UID: \"aa2b6ee6-f793-43c1-bde1-92e1d8e67754\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800037 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/28d61801-55c0-4e08-99b9-0a6b3d16fe71-etcd-client\") pod \"etcd-operator-b45778765-jq4wg\" (UID: \"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800083 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aeb7e86c-90a7-48d9-a641-0814203fce0d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fl5hs\" (UID: \"aeb7e86c-90a7-48d9-a641-0814203fce0d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fl5hs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800109 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/68816dca-2483-4789-9db6-614582f6c45a-signing-key\") pod \"service-ca-9c57cc56f-l62cr\" (UID: \"68816dca-2483-4789-9db6-614582f6c45a\") " pod="openshift-service-ca/service-ca-9c57cc56f-l62cr" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800129 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f99ff\" (UniqueName: \"kubernetes.io/projected/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-kube-api-access-f99ff\") pod \"controller-manager-879f6c89f-dj2tn\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800150 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a81eef7a-7f29-4f79-863e-c3e009b56ad8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800174 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkrqg\" (UniqueName: \"kubernetes.io/projected/aaca7742-2ff5-4b80-9ea2-ed94aa869684-kube-api-access-jkrqg\") pod \"cluster-samples-operator-665b6dd947-2tk9p\" (UID: \"aaca7742-2ff5-4b80-9ea2-ed94aa869684\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tk9p" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800206 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfmx7\" (UniqueName: \"kubernetes.io/projected/cf90e4e1-8fba-4df9-8e30-b392921d4d16-kube-api-access-hfmx7\") pod \"downloads-7954f5f757-7szjl\" (UID: \"cf90e4e1-8fba-4df9-8e30-b392921d4d16\") " pod="openshift-console/downloads-7954f5f757-7szjl" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800227 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kf5l\" (UniqueName: \"kubernetes.io/projected/1022948e-9743-4b7f-9e25-2e2b9070789c-kube-api-access-9kf5l\") pod \"openshift-apiserver-operator-796bbdcf4f-wjmbx\" (UID: \"1022948e-9743-4b7f-9e25-2e2b9070789c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjmbx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800247 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc451e0b-ed99-4138-8e62-01d91d2c914f-service-ca\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800267 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpc44\" (UniqueName: \"kubernetes.io/projected/aeb7e86c-90a7-48d9-a641-0814203fce0d-kube-api-access-wpc44\") pod \"control-plane-machine-set-operator-78cbb6b69f-fl5hs\" (UID: \"aeb7e86c-90a7-48d9-a641-0814203fce0d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fl5hs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800293 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh2lk\" (UniqueName: \"kubernetes.io/projected/bef8a518-6665-402b-98f1-81b5db29d4ed-kube-api-access-sh2lk\") pod \"service-ca-operator-777779d784-t464w\" (UID: \"bef8a518-6665-402b-98f1-81b5db29d4ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t464w" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800312 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-config-volume\") pod \"collect-profiles-29481780-rzljm\" (UID: \"ff8f8f6e-bb01-49c6-864a-9a98a57abea8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800345 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/60f516db-f145-4aeb-8bdf-d7e4445af01b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vbbd6\" (UID: \"60f516db-f145-4aeb-8bdf-d7e4445af01b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800366 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58-serving-cert\") pod \"console-operator-58897d9998-tz72b\" (UID: \"7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58\") " pod="openshift-console-operator/console-operator-58897d9998-tz72b" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800388 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800561 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800604 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60f516db-f145-4aeb-8bdf-d7e4445af01b-proxy-tls\") pod \"machine-config-controller-84d6567774-vbbd6\" (UID: \"60f516db-f145-4aeb-8bdf-d7e4445af01b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800661 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1022948e-9743-4b7f-9e25-2e2b9070789c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wjmbx\" (UID: \"1022948e-9743-4b7f-9e25-2e2b9070789c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjmbx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800728 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68d3d73a-7ef8-49ee-ae94-2d73115e126e-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-cczpb\" (UID: \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800753 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-config\") pod \"controller-manager-879f6c89f-dj2tn\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800783 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a81eef7a-7f29-4f79-863e-c3e009b56ad8-image-import-ca\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800732 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-client-ca\") pod \"controller-manager-879f6c89f-dj2tn\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800816 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/154d567d-eccf-4771-a5e0-60b2375d3e8b-config\") pod \"machine-api-operator-5694c8668f-fw9fs\" (UID: \"154d567d-eccf-4771-a5e0-60b2375d3e8b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800870 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/080291d9-3dec-4cc7-aeff-f9dbdb7abb68-config\") pod \"kube-controller-manager-operator-78b949d7b-dhf59\" (UID: \"080291d9-3dec-4cc7-aeff-f9dbdb7abb68\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dhf59" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800894 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbd861e-1c54-4ec8-beca-021062bf2924-config\") pod \"authentication-operator-69f744f599-fk28r\" (UID: \"9cbd861e-1c54-4ec8-beca-021062bf2924\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800933 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a74ca54e-cdb6-4759-9bf3-fc4a3defb11c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-kdxfq\" (UID: \"a74ca54e-cdb6-4759-9bf3-fc4a3defb11c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kdxfq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800954 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e828266-647a-4cc3-9e4e-4d27d1ddbda1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-m9xqq\" (UID: \"4e828266-647a-4cc3-9e4e-4d27d1ddbda1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9xqq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.800971 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt4qx\" (UniqueName: \"kubernetes.io/projected/9f13654a-0829-467a-8faa-8cbba4049aca-kube-api-access-vt4qx\") pod \"dns-operator-744455d44c-v7tsh\" (UID: \"9f13654a-0829-467a-8faa-8cbba4049aca\") " pod="openshift-dns-operator/dns-operator-744455d44c-v7tsh" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801036 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801053 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801119 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l22gl\" (UniqueName: \"kubernetes.io/projected/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-kube-api-access-l22gl\") pod \"collect-profiles-29481780-rzljm\" (UID: \"ff8f8f6e-bb01-49c6-864a-9a98a57abea8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801134 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4lk4\" (UniqueName: \"kubernetes.io/projected/28d61801-55c0-4e08-99b9-0a6b3d16fe71-kube-api-access-q4lk4\") pod \"etcd-operator-b45778765-jq4wg\" (UID: \"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801190 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8aff5260-35ea-4648-af32-33699d9118c3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5rmt2\" (UID: \"8aff5260-35ea-4648-af32-33699d9118c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801209 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58-config\") pod \"console-operator-58897d9998-tz72b\" (UID: \"7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58\") " pod="openshift-console-operator/console-operator-58897d9998-tz72b" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801269 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nszp2\" (UID: \"74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801327 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cbd861e-1c54-4ec8-beca-021062bf2924-serving-cert\") pod \"authentication-operator-69f744f599-fk28r\" (UID: \"9cbd861e-1c54-4ec8-beca-021062bf2924\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801355 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a81eef7a-7f29-4f79-863e-c3e009b56ad8-audit-dir\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801390 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b87lm\" (UniqueName: \"kubernetes.io/projected/a81eef7a-7f29-4f79-863e-c3e009b56ad8-kube-api-access-b87lm\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801414 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801445 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-serving-cert\") pod \"controller-manager-879f6c89f-dj2tn\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801513 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b84d2\" (UniqueName: \"kubernetes.io/projected/68816dca-2483-4789-9db6-614582f6c45a-kube-api-access-b84d2\") pod \"service-ca-9c57cc56f-l62cr\" (UID: \"68816dca-2483-4789-9db6-614582f6c45a\") " pod="openshift-service-ca/service-ca-9c57cc56f-l62cr" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801563 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbd861e-1c54-4ec8-beca-021062bf2924-config\") pod \"authentication-operator-69f744f599-fk28r\" (UID: \"9cbd861e-1c54-4ec8-beca-021062bf2924\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801614 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e00fdd-4780-4faa-a3c3-75b59218a5a2-config\") pod \"kube-apiserver-operator-766d6c64bb-bwhk5\" (UID: \"f2e00fdd-4780-4faa-a3c3-75b59218a5a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bwhk5" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801646 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a81eef7a-7f29-4f79-863e-c3e009b56ad8-etcd-client\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801669 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st6fv\" (UniqueName: \"kubernetes.io/projected/51f20be1-6fa3-47ce-ac42-6d9a618ae151-kube-api-access-st6fv\") pod 
\"route-controller-manager-6576b87f9c-86klf\" (UID: \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801688 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/68d3d73a-7ef8-49ee-ae94-2d73115e126e-ready\") pod \"cni-sysctl-allowlist-ds-cczpb\" (UID: \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801714 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/088f8776-75dd-4adc-9647-6a03e411313e-trusted-ca\") pod \"ingress-operator-5b745b69d9-b58ws\" (UID: \"088f8776-75dd-4adc-9647-6a03e411313e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801829 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/aa2b6ee6-f793-43c1-bde1-92e1d8e67754-auth-proxy-config\") pod \"machine-approver-56656f9798-xkbs6\" (UID: \"aa2b6ee6-f793-43c1-bde1-92e1d8e67754\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801852 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58-trusted-ca\") pod \"console-operator-58897d9998-tz72b\" (UID: \"7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58\") " pod="openshift-console-operator/console-operator-58897d9998-tz72b" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801879 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dkg2\" (UniqueName: \"kubernetes.io/projected/631d0725-bacc-431d-82ce-6db496387d50-kube-api-access-8dkg2\") pod \"migrator-59844c95c7-kgt2j\" (UID: \"631d0725-bacc-431d-82ce-6db496387d50\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kgt2j" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801903 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaca7742-2ff5-4b80-9ea2-ed94aa869684-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2tk9p\" (UID: \"aaca7742-2ff5-4b80-9ea2-ed94aa869684\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tk9p" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801921 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/080291d9-3dec-4cc7-aeff-f9dbdb7abb68-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dhf59\" (UID: \"080291d9-3dec-4cc7-aeff-f9dbdb7abb68\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dhf59" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801940 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801964 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z79fp\" (UniqueName: \"kubernetes.io/projected/fc451e0b-ed99-4138-8e62-01d91d2c914f-kube-api-access-z79fp\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.801989 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/080291d9-3dec-4cc7-aeff-f9dbdb7abb68-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-dhf59\" (UID: \"080291d9-3dec-4cc7-aeff-f9dbdb7abb68\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dhf59" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.802012 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/088f8776-75dd-4adc-9647-6a03e411313e-metrics-tls\") pod \"ingress-operator-5b745b69d9-b58ws\" (UID: \"088f8776-75dd-4adc-9647-6a03e411313e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.802045 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a81eef7a-7f29-4f79-863e-c3e009b56ad8-etcd-serving-ca\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.802079 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a81eef7a-7f29-4f79-863e-c3e009b56ad8-encryption-config\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.802102 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28d61801-55c0-4e08-99b9-0a6b3d16fe71-config\") pod \"etcd-operator-b45778765-jq4wg\" (UID: \"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.802126 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/28d61801-55c0-4e08-99b9-0a6b3d16fe71-etcd-service-ca\") pod \"etcd-operator-b45778765-jq4wg\" (UID: \"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.802147 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-stats-auth\") pod \"router-default-5444994796-6xx2v\" (UID: 
\"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.802241 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g84nh\" (UniqueName: \"kubernetes.io/projected/9cbd861e-1c54-4ec8-beca-021062bf2924-kube-api-access-g84nh\") pod \"authentication-operator-69f744f599-fk28r\" (UID: \"9cbd861e-1c54-4ec8-beca-021062bf2924\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.802471 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc451e0b-ed99-4138-8e62-01d91d2c914f-console-oauth-config\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.802649 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4vkr\" (UniqueName: \"kubernetes.io/projected/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-kube-api-access-m4vkr\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.802741 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b736ffe7-8ba8-4d20-8831-37d44f8d63de-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5bfjx\" (UID: \"b736ffe7-8ba8-4d20-8831-37d44f8d63de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.802887 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-secret-volume\") pod \"collect-profiles-29481780-rzljm\" (UID: \"ff8f8f6e-bb01-49c6-864a-9a98a57abea8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.802954 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg729\" (UniqueName: \"kubernetes.io/projected/7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58-kube-api-access-pg729\") pod \"console-operator-58897d9998-tz72b\" (UID: \"7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58\") " pod="openshift-console-operator/console-operator-58897d9998-tz72b" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.803020 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9f13654a-0829-467a-8faa-8cbba4049aca-metrics-tls\") pod \"dns-operator-744455d44c-v7tsh\" (UID: \"9f13654a-0829-467a-8faa-8cbba4049aca\") " pod="openshift-dns-operator/dns-operator-744455d44c-v7tsh" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.803122 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc451e0b-ed99-4138-8e62-01d91d2c914f-oauth-serving-cert\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " 
pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.803226 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8aff5260-35ea-4648-af32-33699d9118c3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5rmt2\" (UID: \"8aff5260-35ea-4648-af32-33699d9118c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.803320 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cbd861e-1c54-4ec8-beca-021062bf2924-service-ca-bundle\") pod \"authentication-operator-69f744f599-fk28r\" (UID: \"9cbd861e-1c54-4ec8-beca-021062bf2924\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.803414 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd2kk\" (UniqueName: \"kubernetes.io/projected/b736ffe7-8ba8-4d20-8831-37d44f8d63de-kube-api-access-sd2kk\") pod \"olm-operator-6b444d44fb-5bfjx\" (UID: \"b736ffe7-8ba8-4d20-8831-37d44f8d63de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.803524 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a81eef7a-7f29-4f79-863e-c3e009b56ad8-audit\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.803618 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/68816dca-2483-4789-9db6-614582f6c45a-signing-cabundle\") pod \"service-ca-9c57cc56f-l62cr\" (UID: \"68816dca-2483-4789-9db6-614582f6c45a\") " pod="openshift-service-ca/service-ca-9c57cc56f-l62cr" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.803909 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn86r\" (UniqueName: \"kubernetes.io/projected/60f516db-f145-4aeb-8bdf-d7e4445af01b-kube-api-access-xn86r\") pod \"machine-config-controller-84d6567774-vbbd6\" (UID: \"60f516db-f145-4aeb-8bdf-d7e4445af01b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.804000 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmlgr\" (UniqueName: \"kubernetes.io/projected/4e828266-647a-4cc3-9e4e-4d27d1ddbda1-kube-api-access-pmlgr\") pod \"kube-storage-version-migrator-operator-b67b599dd-m9xqq\" (UID: \"4e828266-647a-4cc3-9e4e-4d27d1ddbda1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9xqq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.804114 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8aff5260-35ea-4648-af32-33699d9118c3-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-5rmt2\" (UID: \"8aff5260-35ea-4648-af32-33699d9118c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.804212 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cbd861e-1c54-4ec8-beca-021062bf2924-serving-cert\") pod \"authentication-operator-69f744f599-fk28r\" (UID: \"9cbd861e-1c54-4ec8-beca-021062bf2924\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.804170 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cbd861e-1c54-4ec8-beca-021062bf2924-service-ca-bundle\") pod \"authentication-operator-69f744f599-fk28r\" (UID: \"9cbd861e-1c54-4ec8-beca-021062bf2924\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.804345 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dj2tn\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.804437 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpwtd\" (UniqueName: \"kubernetes.io/projected/74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a-kube-api-access-wpwtd\") pod \"openshift-config-operator-7777fb866f-nszp2\" (UID: \"74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.804545 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51f20be1-6fa3-47ce-ac42-6d9a618ae151-serving-cert\") pod \"route-controller-manager-6576b87f9c-86klf\" (UID: \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.804619 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.804689 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2e00fdd-4780-4faa-a3c3-75b59218a5a2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bwhk5\" (UID: \"f2e00fdd-4780-4faa-a3c3-75b59218a5a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bwhk5" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.804759 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bef8a518-6665-402b-98f1-81b5db29d4ed-config\") pod \"service-ca-operator-777779d784-t464w\" (UID: \"bef8a518-6665-402b-98f1-81b5db29d4ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t464w" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.804831 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1022948e-9743-4b7f-9e25-2e2b9070789c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wjmbx\" (UID: \"1022948e-9743-4b7f-9e25-2e2b9070789c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjmbx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.804900 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.804971 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2e00fdd-4780-4faa-a3c3-75b59218a5a2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bwhk5\" (UID: \"f2e00fdd-4780-4faa-a3c3-75b59218a5a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bwhk5" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.805057 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/088f8776-75dd-4adc-9647-6a03e411313e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-b58ws\" (UID: \"088f8776-75dd-4adc-9647-6a03e411313e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.805149 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-metrics-certs\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.805215 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68d3d73a-7ef8-49ee-ae94-2d73115e126e-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-cczpb\" (UID: \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.805300 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cbd861e-1c54-4ec8-beca-021062bf2924-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fk28r\" (UID: \"9cbd861e-1c54-4ec8-beca-021062bf2924\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.805383 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/aa2b6ee6-f793-43c1-bde1-92e1d8e67754-machine-approver-tls\") pod \"machine-approver-56656f9798-xkbs6\" (UID: \"aa2b6ee6-f793-43c1-bde1-92e1d8e67754\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.804551 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1022948e-9743-4b7f-9e25-2e2b9070789c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wjmbx\" (UID: \"1022948e-9743-4b7f-9e25-2e2b9070789c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjmbx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.805449 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/348fca40-a376-4560-a301-81c5d7dc93dd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2vqvf\" (UID: \"348fca40-a376-4560-a301-81c5d7dc93dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2vqvf" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.805494 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bef8a518-6665-402b-98f1-81b5db29d4ed-serving-cert\") pod \"service-ca-operator-777779d784-t464w\" (UID: \"bef8a518-6665-402b-98f1-81b5db29d4ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t464w" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.805518 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t6cg\" (UniqueName: \"kubernetes.io/projected/8aff5260-35ea-4648-af32-33699d9118c3-kube-api-access-8t6cg\") pod \"cluster-image-registry-operator-dc59b4c8b-5rmt2\" (UID: \"8aff5260-35ea-4648-af32-33699d9118c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.805536 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-service-ca-bundle\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.805554 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9-webhook-cert\") pod \"packageserver-d55dfcdfc-8cwsq\" (UID: \"9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.805569 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/28d61801-55c0-4e08-99b9-0a6b3d16fe71-etcd-ca\") pod \"etcd-operator-b45778765-jq4wg\" (UID: \"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.805582 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fc451e0b-ed99-4138-8e62-01d91d2c914f-trusted-ca-bundle\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.805596 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a81eef7a-7f29-4f79-863e-c3e009b56ad8-node-pullsecrets\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.805610 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9-apiservice-cert\") pod \"packageserver-d55dfcdfc-8cwsq\" (UID: \"9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.805220 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dj2tn\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.805701 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1022948e-9743-4b7f-9e25-2e2b9070789c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wjmbx\" (UID: \"1022948e-9743-4b7f-9e25-2e2b9070789c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjmbx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.804404 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-serving-cert\") pod \"controller-manager-879f6c89f-dj2tn\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.806308 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cbd861e-1c54-4ec8-beca-021062bf2924-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fk28r\" (UID: \"9cbd861e-1c54-4ec8-beca-021062bf2924\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.808682 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.827833 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.855829 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.868118 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 20 11:04:43 crc 
kubenswrapper[4961]: I0120 11:04:43.906256 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906307 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906335 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e828266-647a-4cc3-9e4e-4d27d1ddbda1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-m9xqq\" (UID: \"4e828266-647a-4cc3-9e4e-4d27d1ddbda1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9xqq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906358 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a-serving-cert\") pod \"openshift-config-operator-7777fb866f-nszp2\" (UID: \"74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906382 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-audit-policies\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906403 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2b6ee6-f793-43c1-bde1-92e1d8e67754-config\") pod \"machine-approver-56656f9798-xkbs6\" (UID: \"aa2b6ee6-f793-43c1-bde1-92e1d8e67754\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906424 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbwvx\" (UniqueName: \"kubernetes.io/projected/aa2b6ee6-f793-43c1-bde1-92e1d8e67754-kube-api-access-kbwvx\") pod \"machine-approver-56656f9798-xkbs6\" (UID: \"aa2b6ee6-f793-43c1-bde1-92e1d8e67754\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906449 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs7d2\" (UniqueName: \"kubernetes.io/projected/68d3d73a-7ef8-49ee-ae94-2d73115e126e-kube-api-access-vs7d2\") pod \"cni-sysctl-allowlist-ds-cczpb\" (UID: \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906474 4961 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aeb7e86c-90a7-48d9-a641-0814203fce0d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fl5hs\" (UID: \"aeb7e86c-90a7-48d9-a641-0814203fce0d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fl5hs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906499 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/28d61801-55c0-4e08-99b9-0a6b3d16fe71-etcd-client\") pod \"etcd-operator-b45778765-jq4wg\" (UID: \"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906520 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/68816dca-2483-4789-9db6-614582f6c45a-signing-key\") pod \"service-ca-9c57cc56f-l62cr\" (UID: \"68816dca-2483-4789-9db6-614582f6c45a\") " pod="openshift-service-ca/service-ca-9c57cc56f-l62cr" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906551 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a81eef7a-7f29-4f79-863e-c3e009b56ad8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906578 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkrqg\" (UniqueName: \"kubernetes.io/projected/aaca7742-2ff5-4b80-9ea2-ed94aa869684-kube-api-access-jkrqg\") pod \"cluster-samples-operator-665b6dd947-2tk9p\" (UID: \"aaca7742-2ff5-4b80-9ea2-ed94aa869684\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tk9p" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906607 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc451e0b-ed99-4138-8e62-01d91d2c914f-service-ca\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906630 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpc44\" (UniqueName: \"kubernetes.io/projected/aeb7e86c-90a7-48d9-a641-0814203fce0d-kube-api-access-wpc44\") pod \"control-plane-machine-set-operator-78cbb6b69f-fl5hs\" (UID: \"aeb7e86c-90a7-48d9-a641-0814203fce0d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fl5hs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906673 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh2lk\" (UniqueName: \"kubernetes.io/projected/bef8a518-6665-402b-98f1-81b5db29d4ed-kube-api-access-sh2lk\") pod \"service-ca-operator-777779d784-t464w\" (UID: \"bef8a518-6665-402b-98f1-81b5db29d4ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t464w" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906695 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-config-volume\") pod 
\"collect-profiles-29481780-rzljm\" (UID: \"ff8f8f6e-bb01-49c6-864a-9a98a57abea8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906729 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58-serving-cert\") pod \"console-operator-58897d9998-tz72b\" (UID: \"7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58\") " pod="openshift-console-operator/console-operator-58897d9998-tz72b" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906754 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906774 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/60f516db-f145-4aeb-8bdf-d7e4445af01b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vbbd6\" (UID: \"60f516db-f145-4aeb-8bdf-d7e4445af01b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906799 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906820 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60f516db-f145-4aeb-8bdf-d7e4445af01b-proxy-tls\") pod \"machine-config-controller-84d6567774-vbbd6\" (UID: \"60f516db-f145-4aeb-8bdf-d7e4445af01b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906843 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68d3d73a-7ef8-49ee-ae94-2d73115e126e-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-cczpb\" (UID: \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906870 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a81eef7a-7f29-4f79-863e-c3e009b56ad8-image-import-ca\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906894 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/080291d9-3dec-4cc7-aeff-f9dbdb7abb68-config\") pod \"kube-controller-manager-operator-78b949d7b-dhf59\" (UID: \"080291d9-3dec-4cc7-aeff-f9dbdb7abb68\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dhf59" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906926 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a74ca54e-cdb6-4759-9bf3-fc4a3defb11c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-kdxfq\" (UID: \"a74ca54e-cdb6-4759-9bf3-fc4a3defb11c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kdxfq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906961 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e828266-647a-4cc3-9e4e-4d27d1ddbda1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-m9xqq\" (UID: \"4e828266-647a-4cc3-9e4e-4d27d1ddbda1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9xqq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.906989 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt4qx\" (UniqueName: \"kubernetes.io/projected/9f13654a-0829-467a-8faa-8cbba4049aca-kube-api-access-vt4qx\") pod \"dns-operator-744455d44c-v7tsh\" (UID: \"9f13654a-0829-467a-8faa-8cbba4049aca\") " pod="openshift-dns-operator/dns-operator-744455d44c-v7tsh" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907017 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907074 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907107 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4lk4\" (UniqueName: \"kubernetes.io/projected/28d61801-55c0-4e08-99b9-0a6b3d16fe71-kube-api-access-q4lk4\") pod \"etcd-operator-b45778765-jq4wg\" (UID: \"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907172 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8aff5260-35ea-4648-af32-33699d9118c3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5rmt2\" (UID: \"8aff5260-35ea-4648-af32-33699d9118c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907209 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l22gl\" (UniqueName: \"kubernetes.io/projected/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-kube-api-access-l22gl\") pod \"collect-profiles-29481780-rzljm\" (UID: \"ff8f8f6e-bb01-49c6-864a-9a98a57abea8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907287 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a81eef7a-7f29-4f79-863e-c3e009b56ad8-audit-dir\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907362 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58-config\") pod \"console-operator-58897d9998-tz72b\" (UID: \"7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58\") " pod="openshift-console-operator/console-operator-58897d9998-tz72b" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907446 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nszp2\" (UID: \"74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907540 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907609 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b87lm\" (UniqueName: \"kubernetes.io/projected/a81eef7a-7f29-4f79-863e-c3e009b56ad8-kube-api-access-b87lm\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907633 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b84d2\" (UniqueName: \"kubernetes.io/projected/68816dca-2483-4789-9db6-614582f6c45a-kube-api-access-b84d2\") pod \"service-ca-9c57cc56f-l62cr\" (UID: \"68816dca-2483-4789-9db6-614582f6c45a\") " pod="openshift-service-ca/service-ca-9c57cc56f-l62cr" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907654 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e00fdd-4780-4faa-a3c3-75b59218a5a2-config\") pod \"kube-apiserver-operator-766d6c64bb-bwhk5\" (UID: \"f2e00fdd-4780-4faa-a3c3-75b59218a5a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bwhk5" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907677 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a81eef7a-7f29-4f79-863e-c3e009b56ad8-etcd-client\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907701 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st6fv\" 
(UniqueName: \"kubernetes.io/projected/51f20be1-6fa3-47ce-ac42-6d9a618ae151-kube-api-access-st6fv\") pod \"route-controller-manager-6576b87f9c-86klf\" (UID: \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907723 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/68d3d73a-7ef8-49ee-ae94-2d73115e126e-ready\") pod \"cni-sysctl-allowlist-ds-cczpb\" (UID: \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907749 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/aa2b6ee6-f793-43c1-bde1-92e1d8e67754-auth-proxy-config\") pod \"machine-approver-56656f9798-xkbs6\" (UID: \"aa2b6ee6-f793-43c1-bde1-92e1d8e67754\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907771 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/088f8776-75dd-4adc-9647-6a03e411313e-trusted-ca\") pod \"ingress-operator-5b745b69d9-b58ws\" (UID: \"088f8776-75dd-4adc-9647-6a03e411313e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907795 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58-trusted-ca\") pod \"console-operator-58897d9998-tz72b\" (UID: \"7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58\") " pod="openshift-console-operator/console-operator-58897d9998-tz72b" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907827 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dkg2\" (UniqueName: \"kubernetes.io/projected/631d0725-bacc-431d-82ce-6db496387d50-kube-api-access-8dkg2\") pod \"migrator-59844c95c7-kgt2j\" (UID: \"631d0725-bacc-431d-82ce-6db496387d50\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kgt2j" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907852 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/080291d9-3dec-4cc7-aeff-f9dbdb7abb68-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dhf59\" (UID: \"080291d9-3dec-4cc7-aeff-f9dbdb7abb68\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dhf59" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907877 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907902 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z79fp\" (UniqueName: \"kubernetes.io/projected/fc451e0b-ed99-4138-8e62-01d91d2c914f-kube-api-access-z79fp\") pod \"console-f9d7485db-2cs6z\" (UID: 
\"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907930 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaca7742-2ff5-4b80-9ea2-ed94aa869684-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2tk9p\" (UID: \"aaca7742-2ff5-4b80-9ea2-ed94aa869684\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tk9p" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907954 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/080291d9-3dec-4cc7-aeff-f9dbdb7abb68-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-dhf59\" (UID: \"080291d9-3dec-4cc7-aeff-f9dbdb7abb68\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dhf59" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.907978 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/088f8776-75dd-4adc-9647-6a03e411313e-metrics-tls\") pod \"ingress-operator-5b745b69d9-b58ws\" (UID: \"088f8776-75dd-4adc-9647-6a03e411313e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908006 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28d61801-55c0-4e08-99b9-0a6b3d16fe71-config\") pod \"etcd-operator-b45778765-jq4wg\" (UID: \"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908026 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/28d61801-55c0-4e08-99b9-0a6b3d16fe71-etcd-service-ca\") pod \"etcd-operator-b45778765-jq4wg\" (UID: \"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908081 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-stats-auth\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908106 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a81eef7a-7f29-4f79-863e-c3e009b56ad8-etcd-serving-ca\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908126 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a81eef7a-7f29-4f79-863e-c3e009b56ad8-encryption-config\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908157 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc451e0b-ed99-4138-8e62-01d91d2c914f-console-oauth-config\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908182 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4vkr\" (UniqueName: \"kubernetes.io/projected/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-kube-api-access-m4vkr\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908215 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-secret-volume\") pod \"collect-profiles-29481780-rzljm\" (UID: \"ff8f8f6e-bb01-49c6-864a-9a98a57abea8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908235 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg729\" (UniqueName: \"kubernetes.io/projected/7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58-kube-api-access-pg729\") pod \"console-operator-58897d9998-tz72b\" (UID: \"7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58\") " pod="openshift-console-operator/console-operator-58897d9998-tz72b" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908255 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b736ffe7-8ba8-4d20-8831-37d44f8d63de-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5bfjx\" (UID: \"b736ffe7-8ba8-4d20-8831-37d44f8d63de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908268 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-audit-policies\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908289 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8aff5260-35ea-4648-af32-33699d9118c3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5rmt2\" (UID: \"8aff5260-35ea-4648-af32-33699d9118c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908313 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9f13654a-0829-467a-8faa-8cbba4049aca-metrics-tls\") pod \"dns-operator-744455d44c-v7tsh\" (UID: \"9f13654a-0829-467a-8faa-8cbba4049aca\") " pod="openshift-dns-operator/dns-operator-744455d44c-v7tsh" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908337 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc451e0b-ed99-4138-8e62-01d91d2c914f-oauth-serving-cert\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") 
" pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908371 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a81eef7a-7f29-4f79-863e-c3e009b56ad8-audit\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908394 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd2kk\" (UniqueName: \"kubernetes.io/projected/b736ffe7-8ba8-4d20-8831-37d44f8d63de-kube-api-access-sd2kk\") pod \"olm-operator-6b444d44fb-5bfjx\" (UID: \"b736ffe7-8ba8-4d20-8831-37d44f8d63de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908416 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/68816dca-2483-4789-9db6-614582f6c45a-signing-cabundle\") pod \"service-ca-9c57cc56f-l62cr\" (UID: \"68816dca-2483-4789-9db6-614582f6c45a\") " pod="openshift-service-ca/service-ca-9c57cc56f-l62cr" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908436 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmlgr\" (UniqueName: \"kubernetes.io/projected/4e828266-647a-4cc3-9e4e-4d27d1ddbda1-kube-api-access-pmlgr\") pod \"kube-storage-version-migrator-operator-b67b599dd-m9xqq\" (UID: \"4e828266-647a-4cc3-9e4e-4d27d1ddbda1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9xqq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908458 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8aff5260-35ea-4648-af32-33699d9118c3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5rmt2\" (UID: \"8aff5260-35ea-4648-af32-33699d9118c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908528 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn86r\" (UniqueName: \"kubernetes.io/projected/60f516db-f145-4aeb-8bdf-d7e4445af01b-kube-api-access-xn86r\") pod \"machine-config-controller-84d6567774-vbbd6\" (UID: \"60f516db-f145-4aeb-8bdf-d7e4445af01b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908563 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpwtd\" (UniqueName: \"kubernetes.io/projected/74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a-kube-api-access-wpwtd\") pod \"openshift-config-operator-7777fb866f-nszp2\" (UID: \"74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908587 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51f20be1-6fa3-47ce-ac42-6d9a618ae151-serving-cert\") pod \"route-controller-manager-6576b87f9c-86klf\" (UID: \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:04:43 crc 
kubenswrapper[4961]: I0120 11:04:43.908612 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908635 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bef8a518-6665-402b-98f1-81b5db29d4ed-config\") pod \"service-ca-operator-777779d784-t464w\" (UID: \"bef8a518-6665-402b-98f1-81b5db29d4ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t464w" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908657 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2e00fdd-4780-4faa-a3c3-75b59218a5a2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bwhk5\" (UID: \"f2e00fdd-4780-4faa-a3c3-75b59218a5a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bwhk5" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908679 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2e00fdd-4780-4faa-a3c3-75b59218a5a2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bwhk5\" (UID: \"f2e00fdd-4780-4faa-a3c3-75b59218a5a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bwhk5" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908701 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908724 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68d3d73a-7ef8-49ee-ae94-2d73115e126e-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-cczpb\" (UID: \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908762 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/088f8776-75dd-4adc-9647-6a03e411313e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-b58ws\" (UID: \"088f8776-75dd-4adc-9647-6a03e411313e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908782 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-metrics-certs\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908804 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/aa2b6ee6-f793-43c1-bde1-92e1d8e67754-machine-approver-tls\") pod \"machine-approver-56656f9798-xkbs6\" (UID: \"aa2b6ee6-f793-43c1-bde1-92e1d8e67754\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908828 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/348fca40-a376-4560-a301-81c5d7dc93dd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2vqvf\" (UID: \"348fca40-a376-4560-a301-81c5d7dc93dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2vqvf" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908855 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/68d3d73a-7ef8-49ee-ae94-2d73115e126e-ready\") pod \"cni-sysctl-allowlist-ds-cczpb\" (UID: \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908859 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-service-ca-bundle\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.908984 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bef8a518-6665-402b-98f1-81b5db29d4ed-serving-cert\") pod \"service-ca-operator-777779d784-t464w\" (UID: \"bef8a518-6665-402b-98f1-81b5db29d4ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t464w" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909225 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a81eef7a-7f29-4f79-863e-c3e009b56ad8-audit-dir\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909362 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t6cg\" (UniqueName: \"kubernetes.io/projected/8aff5260-35ea-4648-af32-33699d9118c3-kube-api-access-8t6cg\") pod \"cluster-image-registry-operator-dc59b4c8b-5rmt2\" (UID: \"8aff5260-35ea-4648-af32-33699d9118c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909409 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc451e0b-ed99-4138-8e62-01d91d2c914f-trusted-ca-bundle\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909446 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a81eef7a-7f29-4f79-863e-c3e009b56ad8-node-pullsecrets\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 
11:04:43.909477 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9-apiservice-cert\") pod \"packageserver-d55dfcdfc-8cwsq\" (UID: \"9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909502 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9-webhook-cert\") pod \"packageserver-d55dfcdfc-8cwsq\" (UID: \"9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909539 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/28d61801-55c0-4e08-99b9-0a6b3d16fe71-etcd-ca\") pod \"etcd-operator-b45778765-jq4wg\" (UID: \"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909569 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a81eef7a-7f29-4f79-863e-c3e009b56ad8-config\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909596 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a74ca54e-cdb6-4759-9bf3-fc4a3defb11c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kdxfq\" (UID: \"a74ca54e-cdb6-4759-9bf3-fc4a3defb11c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kdxfq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909626 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b736ffe7-8ba8-4d20-8831-37d44f8d63de-srv-cert\") pod \"olm-operator-6b444d44fb-5bfjx\" (UID: \"b736ffe7-8ba8-4d20-8831-37d44f8d63de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909673 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r2q5\" (UniqueName: \"kubernetes.io/projected/348fca40-a376-4560-a301-81c5d7dc93dd-kube-api-access-4r2q5\") pod \"multus-admission-controller-857f4d67dd-2vqvf\" (UID: \"348fca40-a376-4560-a301-81c5d7dc93dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2vqvf" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909710 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc451e0b-ed99-4138-8e62-01d91d2c914f-console-serving-cert\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909740 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28d61801-55c0-4e08-99b9-0a6b3d16fe71-serving-cert\") pod \"etcd-operator-b45778765-jq4wg\" (UID: 
\"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909768 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909797 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a81eef7a-7f29-4f79-863e-c3e009b56ad8-serving-cert\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909824 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82cc64f1-0377-43e9-94a0-213d82b4a415-audit-dir\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909859 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51f20be1-6fa3-47ce-ac42-6d9a618ae151-client-ca\") pod \"route-controller-manager-6576b87f9c-86klf\" (UID: \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909900 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9888\" (UniqueName: \"kubernetes.io/projected/9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9-kube-api-access-w9888\") pod \"packageserver-d55dfcdfc-8cwsq\" (UID: \"9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909933 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rxrm\" (UniqueName: \"kubernetes.io/projected/82cc64f1-0377-43e9-94a0-213d82b4a415-kube-api-access-7rxrm\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909961 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9-tmpfs\") pod \"packageserver-d55dfcdfc-8cwsq\" (UID: \"9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.909998 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj599\" (UniqueName: \"kubernetes.io/projected/088f8776-75dd-4adc-9647-6a03e411313e-kube-api-access-rj599\") pod \"ingress-operator-5b745b69d9-b58ws\" (UID: \"088f8776-75dd-4adc-9647-6a03e411313e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.910028 4961 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc451e0b-ed99-4138-8e62-01d91d2c914f-console-config\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.910055 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51f20be1-6fa3-47ce-ac42-6d9a618ae151-config\") pod \"route-controller-manager-6576b87f9c-86klf\" (UID: \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.910105 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fvf8\" (UniqueName: \"kubernetes.io/projected/a74ca54e-cdb6-4759-9bf3-fc4a3defb11c-kube-api-access-9fvf8\") pod \"openshift-controller-manager-operator-756b6f6bc6-kdxfq\" (UID: \"a74ca54e-cdb6-4759-9bf3-fc4a3defb11c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kdxfq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.910147 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-default-certificate\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.910216 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a81eef7a-7f29-4f79-863e-c3e009b56ad8-etcd-serving-ca\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.910729 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58-config\") pod \"console-operator-58897d9998-tz72b\" (UID: \"7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58\") " pod="openshift-console-operator/console-operator-58897d9998-tz72b" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.911133 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58-trusted-ca\") pod \"console-operator-58897d9998-tz72b\" (UID: \"7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58\") " pod="openshift-console-operator/console-operator-58897d9998-tz72b" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.911154 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-config-volume\") pod \"collect-profiles-29481780-rzljm\" (UID: \"ff8f8f6e-bb01-49c6-864a-9a98a57abea8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.911164 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8aff5260-35ea-4648-af32-33699d9118c3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5rmt2\" (UID: \"8aff5260-35ea-4648-af32-33699d9118c3\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.911269 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nszp2\" (UID: \"74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.912044 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e828266-647a-4cc3-9e4e-4d27d1ddbda1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-m9xqq\" (UID: \"4e828266-647a-4cc3-9e4e-4d27d1ddbda1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9xqq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.912295 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.912417 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.914170 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.915216 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e00fdd-4780-4faa-a3c3-75b59218a5a2-config\") pod \"kube-apiserver-operator-766d6c64bb-bwhk5\" (UID: \"f2e00fdd-4780-4faa-a3c3-75b59218a5a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bwhk5" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.915400 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/68816dca-2483-4789-9db6-614582f6c45a-signing-key\") pod \"service-ca-9c57cc56f-l62cr\" (UID: \"68816dca-2483-4789-9db6-614582f6c45a\") " pod="openshift-service-ca/service-ca-9c57cc56f-l62cr" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.915555 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/60f516db-f145-4aeb-8bdf-d7e4445af01b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vbbd6\" (UID: \"60f516db-f145-4aeb-8bdf-d7e4445af01b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.915554 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a81eef7a-7f29-4f79-863e-c3e009b56ad8-trusted-ca-bundle\") pod 
\"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.916252 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.916279 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a81eef7a-7f29-4f79-863e-c3e009b56ad8-etcd-client\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.916481 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.916668 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58-serving-cert\") pod \"console-operator-58897d9998-tz72b\" (UID: \"7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58\") " pod="openshift-console-operator/console-operator-58897d9998-tz72b" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.916687 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a81eef7a-7f29-4f79-863e-c3e009b56ad8-encryption-config\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.916998 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.917225 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-secret-volume\") pod \"collect-profiles-29481780-rzljm\" (UID: \"ff8f8f6e-bb01-49c6-864a-9a98a57abea8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.918446 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.920172 4961 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bef8a518-6665-402b-98f1-81b5db29d4ed-config\") pod \"service-ca-operator-777779d784-t464w\" (UID: \"bef8a518-6665-402b-98f1-81b5db29d4ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t464w" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.920820 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9-tmpfs\") pod \"packageserver-d55dfcdfc-8cwsq\" (UID: \"9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.921170 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a81eef7a-7f29-4f79-863e-c3e009b56ad8-config\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.921423 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82cc64f1-0377-43e9-94a0-213d82b4a415-audit-dir\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.922552 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a81eef7a-7f29-4f79-863e-c3e009b56ad8-audit\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.922705 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a81eef7a-7f29-4f79-863e-c3e009b56ad8-image-import-ca\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.923181 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.923268 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a81eef7a-7f29-4f79-863e-c3e009b56ad8-node-pullsecrets\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.923459 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/68816dca-2483-4789-9db6-614582f6c45a-signing-cabundle\") pod \"service-ca-9c57cc56f-l62cr\" (UID: \"68816dca-2483-4789-9db6-614582f6c45a\") " pod="openshift-service-ca/service-ca-9c57cc56f-l62cr" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.923530 4961 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68d3d73a-7ef8-49ee-ae94-2d73115e126e-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-cczpb\" (UID: \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.923840 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e828266-647a-4cc3-9e4e-4d27d1ddbda1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-m9xqq\" (UID: \"4e828266-647a-4cc3-9e4e-4d27d1ddbda1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9xqq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.923964 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a74ca54e-cdb6-4759-9bf3-fc4a3defb11c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-kdxfq\" (UID: \"a74ca54e-cdb6-4759-9bf3-fc4a3defb11c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kdxfq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.924657 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/080291d9-3dec-4cc7-aeff-f9dbdb7abb68-config\") pod \"kube-controller-manager-operator-78b949d7b-dhf59\" (UID: \"080291d9-3dec-4cc7-aeff-f9dbdb7abb68\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dhf59" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.927981 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.929792 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a-serving-cert\") pod \"openshift-config-operator-7777fb866f-nszp2\" (UID: \"74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.929877 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.930304 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b736ffe7-8ba8-4d20-8831-37d44f8d63de-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5bfjx\" (UID: \"b736ffe7-8ba8-4d20-8831-37d44f8d63de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.930447 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.930477 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/348fca40-a376-4560-a301-81c5d7dc93dd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2vqvf\" (UID: \"348fca40-a376-4560-a301-81c5d7dc93dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2vqvf" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.930542 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/154d567d-eccf-4771-a5e0-60b2375d3e8b-images\") pod \"machine-api-operator-5694c8668f-fw9fs\" (UID: \"154d567d-eccf-4771-a5e0-60b2375d3e8b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.930596 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8aff5260-35ea-4648-af32-33699d9118c3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5rmt2\" (UID: \"8aff5260-35ea-4648-af32-33699d9118c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.930707 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.931097 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a81eef7a-7f29-4f79-863e-c3e009b56ad8-serving-cert\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.931443 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.932627 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9-webhook-cert\") pod \"packageserver-d55dfcdfc-8cwsq\" (UID: \"9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.933461 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/080291d9-3dec-4cc7-aeff-f9dbdb7abb68-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dhf59\" (UID: \"080291d9-3dec-4cc7-aeff-f9dbdb7abb68\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dhf59" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.934184 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/a74ca54e-cdb6-4759-9bf3-fc4a3defb11c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kdxfq\" (UID: \"a74ca54e-cdb6-4759-9bf3-fc4a3defb11c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kdxfq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.935319 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9f13654a-0829-467a-8faa-8cbba4049aca-metrics-tls\") pod \"dns-operator-744455d44c-v7tsh\" (UID: \"9f13654a-0829-467a-8faa-8cbba4049aca\") " pod="openshift-dns-operator/dns-operator-744455d44c-v7tsh" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.935590 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bef8a518-6665-402b-98f1-81b5db29d4ed-serving-cert\") pod \"service-ca-operator-777779d784-t464w\" (UID: \"bef8a518-6665-402b-98f1-81b5db29d4ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t464w" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.935600 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9-apiservice-cert\") pod \"packageserver-d55dfcdfc-8cwsq\" (UID: \"9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.935748 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2e00fdd-4780-4faa-a3c3-75b59218a5a2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bwhk5\" (UID: \"f2e00fdd-4780-4faa-a3c3-75b59218a5a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bwhk5" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.948392 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.951632 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/154d567d-eccf-4771-a5e0-60b2375d3e8b-config\") pod \"machine-api-operator-5694c8668f-fw9fs\" (UID: \"154d567d-eccf-4771-a5e0-60b2375d3e8b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.968662 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.987937 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 20 11:04:43 crc kubenswrapper[4961]: I0120 11:04:43.993268 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/154d567d-eccf-4771-a5e0-60b2375d3e8b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fw9fs\" (UID: \"154d567d-eccf-4771-a5e0-60b2375d3e8b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.008747 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 
11:04:44.041430 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.053360 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.071901 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.088430 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.095970 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/aa2b6ee6-f793-43c1-bde1-92e1d8e67754-machine-approver-tls\") pod \"machine-approver-56656f9798-xkbs6\" (UID: \"aa2b6ee6-f793-43c1-bde1-92e1d8e67754\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.108845 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.119651 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/aa2b6ee6-f793-43c1-bde1-92e1d8e67754-auth-proxy-config\") pod \"machine-approver-56656f9798-xkbs6\" (UID: \"aa2b6ee6-f793-43c1-bde1-92e1d8e67754\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.128870 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.138416 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2b6ee6-f793-43c1-bde1-92e1d8e67754-config\") pod \"machine-approver-56656f9798-xkbs6\" (UID: \"aa2b6ee6-f793-43c1-bde1-92e1d8e67754\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.149145 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.169222 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.195829 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.199917 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/088f8776-75dd-4adc-9647-6a03e411313e-trusted-ca\") pod \"ingress-operator-5b745b69d9-b58ws\" (UID: \"088f8776-75dd-4adc-9647-6a03e411313e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.208972 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 20 11:04:44 crc 
kubenswrapper[4961]: I0120 11:04:44.229166 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.242780 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aaca7742-2ff5-4b80-9ea2-ed94aa869684-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2tk9p\" (UID: \"aaca7742-2ff5-4b80-9ea2-ed94aa869684\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tk9p" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.248030 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.268974 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.288706 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.293743 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/088f8776-75dd-4adc-9647-6a03e411313e-metrics-tls\") pod \"ingress-operator-5b745b69d9-b58ws\" (UID: \"088f8776-75dd-4adc-9647-6a03e411313e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.309558 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.329235 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.349454 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.368949 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.373506 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28d61801-55c0-4e08-99b9-0a6b3d16fe71-serving-cert\") pod \"etcd-operator-b45778765-jq4wg\" (UID: \"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.387759 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.403462 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/28d61801-55c0-4e08-99b9-0a6b3d16fe71-etcd-client\") pod \"etcd-operator-b45778765-jq4wg\" (UID: \"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.409222 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.428458 4961 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.433492 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28d61801-55c0-4e08-99b9-0a6b3d16fe71-config\") pod \"etcd-operator-b45778765-jq4wg\" (UID: \"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.448997 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.452633 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/28d61801-55c0-4e08-99b9-0a6b3d16fe71-etcd-ca\") pod \"etcd-operator-b45778765-jq4wg\" (UID: \"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.469484 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.471369 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/28d61801-55c0-4e08-99b9-0a6b3d16fe71-etcd-service-ca\") pod \"etcd-operator-b45778765-jq4wg\" (UID: \"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.488581 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.508756 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.521377 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aeb7e86c-90a7-48d9-a641-0814203fce0d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fl5hs\" (UID: \"aeb7e86c-90a7-48d9-a641-0814203fce0d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fl5hs" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.528866 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.538268 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.548920 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.569003 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.588309 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.600576 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51f20be1-6fa3-47ce-ac42-6d9a618ae151-serving-cert\") pod \"route-controller-manager-6576b87f9c-86klf\" (UID: \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.608696 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.610707 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51f20be1-6fa3-47ce-ac42-6d9a618ae151-config\") pod \"route-controller-manager-6576b87f9c-86klf\" (UID: \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.628624 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.633037 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51f20be1-6fa3-47ce-ac42-6d9a618ae151-client-ca\") pod \"route-controller-manager-6576b87f9c-86klf\" (UID: \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.649027 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.668441 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.670654 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc451e0b-ed99-4138-8e62-01d91d2c914f-service-ca\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.689019 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.710130 4961 request.go:700] Waited for 1.013515818s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/secrets?fieldSelector=metadata.name%3Dconsole-serving-cert&limit=500&resourceVersion=0 Jan 20 11:04:44 crc 
kubenswrapper[4961]: I0120 11:04:44.712182 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.723262 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc451e0b-ed99-4138-8e62-01d91d2c914f-console-serving-cert\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.728721 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.733363 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc451e0b-ed99-4138-8e62-01d91d2c914f-console-oauth-config\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.765186 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.766815 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc451e0b-ed99-4138-8e62-01d91d2c914f-trusted-ca-bundle\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.768975 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.774232 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc451e0b-ed99-4138-8e62-01d91d2c914f-oauth-serving-cert\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.788282 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.794009 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc451e0b-ed99-4138-8e62-01d91d2c914f-console-config\") pod \"console-f9d7485db-2cs6z\" (UID: \"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.826441 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x24lw\" (UniqueName: \"kubernetes.io/projected/9f0a10d3-171e-4695-b66e-870bb63e5712-kube-api-access-x24lw\") pod \"apiserver-7bbb656c7d-dxxsg\" (UID: \"9f0a10d3-171e-4695-b66e-870bb63e5712\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.835023 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.848845 4961 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.869421 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.888948 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.900717 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b736ffe7-8ba8-4d20-8831-37d44f8d63de-srv-cert\") pod \"olm-operator-6b444d44fb-5bfjx\" (UID: \"b736ffe7-8ba8-4d20-8831-37d44f8d63de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx" Jan 20 11:04:44 crc kubenswrapper[4961]: E0120 11:04:44.908968 4961 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Jan 20 11:04:44 crc kubenswrapper[4961]: E0120 11:04:44.909048 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-service-ca-bundle podName:d5ac0979-0fc9-48a6-8d22-6ba2c646287a nodeName:}" failed. No retries permitted until 2026-01-20 11:04:45.409029446 +0000 UTC m=+38.193529317 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-service-ca-bundle") pod "router-default-5444994796-6xx2v" (UID: "d5ac0979-0fc9-48a6-8d22-6ba2c646287a") : failed to sync configmap cache: timed out waiting for the condition Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.909301 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 20 11:04:44 crc kubenswrapper[4961]: E0120 11:04:44.909385 4961 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition Jan 20 11:04:44 crc kubenswrapper[4961]: E0120 11:04:44.909428 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-stats-auth podName:d5ac0979-0fc9-48a6-8d22-6ba2c646287a nodeName:}" failed. No retries permitted until 2026-01-20 11:04:45.409418536 +0000 UTC m=+38.193918407 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-stats-auth") pod "router-default-5444994796-6xx2v" (UID: "d5ac0979-0fc9-48a6-8d22-6ba2c646287a") : failed to sync secret cache: timed out waiting for the condition Jan 20 11:04:44 crc kubenswrapper[4961]: E0120 11:04:44.910728 4961 secret.go:188] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition Jan 20 11:04:44 crc kubenswrapper[4961]: E0120 11:04:44.910878 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-default-certificate podName:d5ac0979-0fc9-48a6-8d22-6ba2c646287a nodeName:}" failed. No retries permitted until 2026-01-20 11:04:45.410843989 +0000 UTC m=+38.195344030 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-default-certificate") pod "router-default-5444994796-6xx2v" (UID: "d5ac0979-0fc9-48a6-8d22-6ba2c646287a") : failed to sync secret cache: timed out waiting for the condition Jan 20 11:04:44 crc kubenswrapper[4961]: E0120 11:04:44.922322 4961 secret.go:188] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition Jan 20 11:04:44 crc kubenswrapper[4961]: E0120 11:04:44.922443 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f516db-f145-4aeb-8bdf-d7e4445af01b-proxy-tls podName:60f516db-f145-4aeb-8bdf-d7e4445af01b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:45.422415973 +0000 UTC m=+38.206915844 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/60f516db-f145-4aeb-8bdf-d7e4445af01b-proxy-tls") pod "machine-config-controller-84d6567774-vbbd6" (UID: "60f516db-f145-4aeb-8bdf-d7e4445af01b") : failed to sync secret cache: timed out waiting for the condition Jan 20 11:04:44 crc kubenswrapper[4961]: E0120 11:04:44.922542 4961 configmap.go:193] Couldn't get configMap openshift-multus/cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition Jan 20 11:04:44 crc kubenswrapper[4961]: E0120 11:04:44.922752 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/68d3d73a-7ef8-49ee-ae94-2d73115e126e-cni-sysctl-allowlist podName:68d3d73a-7ef8-49ee-ae94-2d73115e126e nodeName:}" failed. No retries permitted until 2026-01-20 11:04:45.422722661 +0000 UTC m=+38.207222532 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/68d3d73a-7ef8-49ee-ae94-2d73115e126e-cni-sysctl-allowlist") pod "cni-sysctl-allowlist-ds-cczpb" (UID: "68d3d73a-7ef8-49ee-ae94-2d73115e126e") : failed to sync configmap cache: timed out waiting for the condition Jan 20 11:04:44 crc kubenswrapper[4961]: E0120 11:04:44.924389 4961 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Jan 20 11:04:44 crc kubenswrapper[4961]: E0120 11:04:44.924538 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-metrics-certs podName:d5ac0979-0fc9-48a6-8d22-6ba2c646287a nodeName:}" failed. No retries permitted until 2026-01-20 11:04:45.424506183 +0000 UTC m=+38.209006254 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-metrics-certs") pod "router-default-5444994796-6xx2v" (UID: "d5ac0979-0fc9-48a6-8d22-6ba2c646287a") : failed to sync secret cache: timed out waiting for the condition Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.928113 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.948949 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.968419 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 20 11:04:44 crc kubenswrapper[4961]: I0120 11:04:44.991767 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.007843 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.033526 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.035135 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.048208 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.068321 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.090107 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.108489 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.127940 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.148635 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.168443 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.187818 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.209542 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.233774 4961 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-certs-default" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.240008 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg"] Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.248345 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 20 11:04:45 crc kubenswrapper[4961]: W0120 11:04:45.253275 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f0a10d3_171e_4695_b66e_870bb63e5712.slice/crio-25cf7bb3572d7e7d8b518781df1e9e5f7e29e1c537b5e50da430f59fa1ea97cd WatchSource:0}: Error finding container 25cf7bb3572d7e7d8b518781df1e9e5f7e29e1c537b5e50da430f59fa1ea97cd: Status 404 returned error can't find the container with id 25cf7bb3572d7e7d8b518781df1e9e5f7e29e1c537b5e50da430f59fa1ea97cd Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.268033 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.288718 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.309242 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.329379 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.348954 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.369612 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.389380 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.409719 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.428834 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.440096 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-default-certificate\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.440306 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60f516db-f145-4aeb-8bdf-d7e4445af01b-proxy-tls\") pod \"machine-config-controller-84d6567774-vbbd6\" (UID: \"60f516db-f145-4aeb-8bdf-d7e4445af01b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.440394 4961 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68d3d73a-7ef8-49ee-ae94-2d73115e126e-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-cczpb\" (UID: \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.440541 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-stats-auth\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.440726 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-metrics-certs\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.440823 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-service-ca-bundle\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.441607 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-service-ca-bundle\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.444684 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60f516db-f145-4aeb-8bdf-d7e4445af01b-proxy-tls\") pod \"machine-config-controller-84d6567774-vbbd6\" (UID: \"60f516db-f145-4aeb-8bdf-d7e4445af01b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.445051 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-default-certificate\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.445643 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-metrics-certs\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.446049 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-stats-auth\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:45 crc kubenswrapper[4961]: 
I0120 11:04:45.449128 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.468980 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.488142 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.509908 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.529574 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.548806 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.568800 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.571899 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68d3d73a-7ef8-49ee-ae94-2d73115e126e-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-cczpb\" (UID: \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.589078 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.609047 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.628982 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.649629 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.669147 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.689344 4961 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.726782 4961 request.go:700] Waited for 1.926944055s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/serviceaccounts/machine-api-operator/token Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.750165 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpzzq\" (UniqueName: \"kubernetes.io/projected/154d567d-eccf-4771-a5e0-60b2375d3e8b-kube-api-access-xpzzq\") pod \"machine-api-operator-5694c8668f-fw9fs\" (UID: \"154d567d-eccf-4771-a5e0-60b2375d3e8b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.771303 4961 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-f99ff\" (UniqueName: \"kubernetes.io/projected/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-kube-api-access-f99ff\") pod \"controller-manager-879f6c89f-dj2tn\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.790661 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfmx7\" (UniqueName: \"kubernetes.io/projected/cf90e4e1-8fba-4df9-8e30-b392921d4d16-kube-api-access-hfmx7\") pod \"downloads-7954f5f757-7szjl\" (UID: \"cf90e4e1-8fba-4df9-8e30-b392921d4d16\") " pod="openshift-console/downloads-7954f5f757-7szjl" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.808530 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kf5l\" (UniqueName: \"kubernetes.io/projected/1022948e-9743-4b7f-9e25-2e2b9070789c-kube-api-access-9kf5l\") pod \"openshift-apiserver-operator-796bbdcf4f-wjmbx\" (UID: \"1022948e-9743-4b7f-9e25-2e2b9070789c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjmbx" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.824891 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g84nh\" (UniqueName: \"kubernetes.io/projected/9cbd861e-1c54-4ec8-beca-021062bf2924-kube-api-access-g84nh\") pod \"authentication-operator-69f744f599-fk28r\" (UID: \"9cbd861e-1c54-4ec8-beca-021062bf2924\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.863789 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" event={"ID":"9f0a10d3-171e-4695-b66e-870bb63e5712","Type":"ContainerStarted","Data":"25cf7bb3572d7e7d8b518781df1e9e5f7e29e1c537b5e50da430f59fa1ea97cd"} Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.864388 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkrqg\" (UniqueName: \"kubernetes.io/projected/aaca7742-2ff5-4b80-9ea2-ed94aa869684-kube-api-access-jkrqg\") pod \"cluster-samples-operator-665b6dd947-2tk9p\" (UID: \"aaca7742-2ff5-4b80-9ea2-ed94aa869684\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tk9p" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.884552 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs7d2\" (UniqueName: \"kubernetes.io/projected/68d3d73a-7ef8-49ee-ae94-2d73115e126e-kube-api-access-vs7d2\") pod \"cni-sysctl-allowlist-ds-cczpb\" (UID: \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.903127 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbwvx\" (UniqueName: \"kubernetes.io/projected/aa2b6ee6-f793-43c1-bde1-92e1d8e67754-kube-api-access-kbwvx\") pod \"machine-approver-56656f9798-xkbs6\" (UID: \"aa2b6ee6-f793-43c1-bde1-92e1d8e67754\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.923734 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z79fp\" (UniqueName: \"kubernetes.io/projected/fc451e0b-ed99-4138-8e62-01d91d2c914f-kube-api-access-z79fp\") pod \"console-f9d7485db-2cs6z\" (UID: 
\"fc451e0b-ed99-4138-8e62-01d91d2c914f\") " pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.943244 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt4qx\" (UniqueName: \"kubernetes.io/projected/9f13654a-0829-467a-8faa-8cbba4049aca-kube-api-access-vt4qx\") pod \"dns-operator-744455d44c-v7tsh\" (UID: \"9f13654a-0829-467a-8faa-8cbba4049aca\") " pod="openshift-dns-operator/dns-operator-744455d44c-v7tsh" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.949427 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.957724 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.963984 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/080291d9-3dec-4cc7-aeff-f9dbdb7abb68-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-dhf59\" (UID: \"080291d9-3dec-4cc7-aeff-f9dbdb7abb68\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dhf59" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.964305 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tk9p" Jan 20 11:04:45 crc kubenswrapper[4961]: W0120 11:04:45.981689 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa2b6ee6_f793_43c1_bde1_92e1d8e67754.slice/crio-2c151e959d14e01e9050c7494f2701e10129bc2b77124add106ff5d4c5f6a094 WatchSource:0}: Error finding container 2c151e959d14e01e9050c7494f2701e10129bc2b77124add106ff5d4c5f6a094: Status 404 returned error can't find the container with id 2c151e959d14e01e9050c7494f2701e10129bc2b77124add106ff5d4c5f6a094 Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.983563 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-7szjl" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.990109 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l22gl\" (UniqueName: \"kubernetes.io/projected/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-kube-api-access-l22gl\") pod \"collect-profiles-29481780-rzljm\" (UID: \"ff8f8f6e-bb01-49c6-864a-9a98a57abea8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" Jan 20 11:04:45 crc kubenswrapper[4961]: I0120 11:04:45.997832 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.004144 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.008776 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpc44\" (UniqueName: \"kubernetes.io/projected/aeb7e86c-90a7-48d9-a641-0814203fce0d-kube-api-access-wpc44\") pod \"control-plane-machine-set-operator-78cbb6b69f-fl5hs\" (UID: \"aeb7e86c-90a7-48d9-a641-0814203fce0d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fl5hs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.010967 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.017510 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjmbx" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.024302 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh2lk\" (UniqueName: \"kubernetes.io/projected/bef8a518-6665-402b-98f1-81b5db29d4ed-kube-api-access-sh2lk\") pod \"service-ca-operator-777779d784-t464w\" (UID: \"bef8a518-6665-402b-98f1-81b5db29d4ed\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t464w" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.046776 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4lk4\" (UniqueName: \"kubernetes.io/projected/28d61801-55c0-4e08-99b9-0a6b3d16fe71-kube-api-access-q4lk4\") pod \"etcd-operator-b45778765-jq4wg\" (UID: \"28d61801-55c0-4e08-99b9-0a6b3d16fe71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.069525 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4vkr\" (UniqueName: \"kubernetes.io/projected/d5ac0979-0fc9-48a6-8d22-6ba2c646287a-kube-api-access-m4vkr\") pod \"router-default-5444994796-6xx2v\" (UID: \"d5ac0979-0fc9-48a6-8d22-6ba2c646287a\") " pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.089038 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg729\" (UniqueName: \"kubernetes.io/projected/7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58-kube-api-access-pg729\") pod \"console-operator-58897d9998-tz72b\" (UID: \"7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58\") " pod="openshift-console-operator/console-operator-58897d9998-tz72b" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.094598 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.095684 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-tz72b" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.113022 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b87lm\" (UniqueName: \"kubernetes.io/projected/a81eef7a-7f29-4f79-863e-c3e009b56ad8-kube-api-access-b87lm\") pod \"apiserver-76f77b778f-7r52t\" (UID: \"a81eef7a-7f29-4f79-863e-c3e009b56ad8\") " pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.129577 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b84d2\" (UniqueName: \"kubernetes.io/projected/68816dca-2483-4789-9db6-614582f6c45a-kube-api-access-b84d2\") pod \"service-ca-9c57cc56f-l62cr\" (UID: \"68816dca-2483-4789-9db6-614582f6c45a\") " pod="openshift-service-ca/service-ca-9c57cc56f-l62cr" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.153935 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fw9fs"] Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.157683 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.164104 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dkg2\" (UniqueName: \"kubernetes.io/projected/631d0725-bacc-431d-82ce-6db496387d50-kube-api-access-8dkg2\") pod \"migrator-59844c95c7-kgt2j\" (UID: \"631d0725-bacc-431d-82ce-6db496387d50\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kgt2j" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.167980 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-v7tsh" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.171608 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9888\" (UniqueName: \"kubernetes.io/projected/9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9-kube-api-access-w9888\") pod \"packageserver-d55dfcdfc-8cwsq\" (UID: \"9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.174214 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.174769 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.180661 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.187736 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj599\" (UniqueName: \"kubernetes.io/projected/088f8776-75dd-4adc-9647-6a03e411313e-kube-api-access-rj599\") pod \"ingress-operator-5b745b69d9-b58ws\" (UID: \"088f8776-75dd-4adc-9647-6a03e411313e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.192749 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-t464w" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.203845 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-l62cr" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.215477 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rxrm\" (UniqueName: \"kubernetes.io/projected/82cc64f1-0377-43e9-94a0-213d82b4a415-kube-api-access-7rxrm\") pod \"oauth-openshift-558db77b4-jjg29\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.219403 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dhf59" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.232295 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/088f8776-75dd-4adc-9647-6a03e411313e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-b58ws\" (UID: \"088f8776-75dd-4adc-9647-6a03e411313e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.234337 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.253022 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r2q5\" (UniqueName: \"kubernetes.io/projected/348fca40-a376-4560-a301-81c5d7dc93dd-kube-api-access-4r2q5\") pod \"multus-admission-controller-857f4d67dd-2vqvf\" (UID: \"348fca40-a376-4560-a301-81c5d7dc93dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2vqvf" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.270306 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st6fv\" (UniqueName: \"kubernetes.io/projected/51f20be1-6fa3-47ce-ac42-6d9a618ae151-kube-api-access-st6fv\") pod \"route-controller-manager-6576b87f9c-86klf\" (UID: \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.271618 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.295768 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fl5hs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.296702 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.296967 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.300688 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd2kk\" (UniqueName: \"kubernetes.io/projected/b736ffe7-8ba8-4d20-8831-37d44f8d63de-kube-api-access-sd2kk\") pod \"olm-operator-6b444d44fb-5bfjx\" (UID: \"b736ffe7-8ba8-4d20-8831-37d44f8d63de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx" Jan 20 11:04:46 crc kubenswrapper[4961]: W0120 11:04:46.303990 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d3d73a_7ef8_49ee_ae94_2d73115e126e.slice/crio-7f08d3d4d95c2a1776457b5b7921be61184652f9f827f1cdd8b7df0f01586d5e WatchSource:0}: Error finding container 7f08d3d4d95c2a1776457b5b7921be61184652f9f827f1cdd8b7df0f01586d5e: Status 404 returned error can't find the container with id 7f08d3d4d95c2a1776457b5b7921be61184652f9f827f1cdd8b7df0f01586d5e Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.321757 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fvf8\" (UniqueName: \"kubernetes.io/projected/a74ca54e-cdb6-4759-9bf3-fc4a3defb11c-kube-api-access-9fvf8\") pod \"openshift-controller-manager-operator-756b6f6bc6-kdxfq\" (UID: \"a74ca54e-cdb6-4759-9bf3-fc4a3defb11c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kdxfq" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.325516 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpwtd\" (UniqueName: \"kubernetes.io/projected/74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a-kube-api-access-wpwtd\") pod \"openshift-config-operator-7777fb866f-nszp2\" (UID: \"74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.342234 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.346113 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn86r\" (UniqueName: \"kubernetes.io/projected/60f516db-f145-4aeb-8bdf-d7e4445af01b-kube-api-access-xn86r\") pod \"machine-config-controller-84d6567774-vbbd6\" (UID: \"60f516db-f145-4aeb-8bdf-d7e4445af01b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.388206 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2e00fdd-4780-4faa-a3c3-75b59218a5a2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bwhk5\" (UID: \"f2e00fdd-4780-4faa-a3c3-75b59218a5a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bwhk5" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.392996 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.394571 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7szjl"] Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.399682 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t6cg\" (UniqueName: \"kubernetes.io/projected/8aff5260-35ea-4648-af32-33699d9118c3-kube-api-access-8t6cg\") pod \"cluster-image-registry-operator-dc59b4c8b-5rmt2\" (UID: \"8aff5260-35ea-4648-af32-33699d9118c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.403244 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kgt2j" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.413672 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8aff5260-35ea-4648-af32-33699d9118c3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5rmt2\" (UID: \"8aff5260-35ea-4648-af32-33699d9118c3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.434541 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmlgr\" (UniqueName: \"kubernetes.io/projected/4e828266-647a-4cc3-9e4e-4d27d1ddbda1-kube-api-access-pmlgr\") pod \"kube-storage-version-migrator-operator-b67b599dd-m9xqq\" (UID: \"4e828266-647a-4cc3-9e4e-4d27d1ddbda1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9xqq" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.434753 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.434907 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.444118 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.445989 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bwhk5" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.453298 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2vqvf" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.456511 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.493894 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fk28r"] Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.494510 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-2cs6z"] Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.511916 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjmbx"] Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.512730 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dj2tn"] Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.527703 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9xqq" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.528586 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tk9p"] Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.541692 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kdxfq" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.562989 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/54e356e3-9033-4c55-b862-075b04e96bf2-certs\") pod \"machine-config-server-dghc2\" (UID: \"54e356e3-9033-4c55-b862-075b04e96bf2\") " pod="openshift-machine-config-operator/machine-config-server-dghc2" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.564540 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0359d20b-0121-43aa-8b56-04fc6210db6e-plugins-dir\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.564576 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd2rt\" (UniqueName: \"kubernetes.io/projected/0359d20b-0121-43aa-8b56-04fc6210db6e-kube-api-access-fd2rt\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.564651 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/221c46d0-ccdb-4e6a-a143-04c3bce55711-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bslr7\" (UID: \"221c46d0-ccdb-4e6a-a143-04c3bce55711\") " pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.564700 4961 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c38b7e7-a659-4038-aadf-b54948bfebf4-installation-pull-secrets\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.564727 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0359d20b-0121-43aa-8b56-04fc6210db6e-mountpoint-dir\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.564802 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/79d8764d-52dd-4350-9bef-e079ce9ade6a-srv-cert\") pod \"catalog-operator-68c6474976-jmdhz\" (UID: \"79d8764d-52dd-4350-9bef-e079ce9ade6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.564852 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c38b7e7-a659-4038-aadf-b54948bfebf4-registry-certificates\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.564894 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f99sr\" (UniqueName: \"kubernetes.io/projected/4e3fb6d8-549c-408c-b3bb-e8080a7b45f3-kube-api-access-f99sr\") pod \"dns-default-44twq\" (UID: \"4e3fb6d8-549c-408c-b3bb-e8080a7b45f3\") " pod="openshift-dns/dns-default-44twq" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.564938 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwcwd\" (UniqueName: \"kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-kube-api-access-bwcwd\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.564966 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4bab7374-a3b1-48d3-97f3-1b3e63392ff3-images\") pod \"machine-config-operator-74547568cd-ngbt8\" (UID: \"4bab7374-a3b1-48d3-97f3-1b3e63392ff3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.564998 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2njzn\" (UniqueName: \"kubernetes.io/projected/221c46d0-ccdb-4e6a-a143-04c3bce55711-kube-api-access-2njzn\") pod \"marketplace-operator-79b997595-bslr7\" (UID: \"221c46d0-ccdb-4e6a-a143-04c3bce55711\") " pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.565176 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" 
(UniqueName: \"kubernetes.io/host-path/0359d20b-0121-43aa-8b56-04fc6210db6e-csi-data-dir\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.565214 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.565239 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-registry-tls\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.565311 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-bound-sa-token\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.565355 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6088e2cb-5b23-44f4-87a5-af1d5f36bca3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6lk7p\" (UID: \"6088e2cb-5b23-44f4-87a5-af1d5f36bca3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6lk7p" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.565380 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtr9c\" (UniqueName: \"kubernetes.io/projected/c819a2b6-2905-4966-aad3-55cf95ee88ef-kube-api-access-dtr9c\") pod \"ingress-canary-x6vpd\" (UID: \"c819a2b6-2905-4966-aad3-55cf95ee88ef\") " pod="openshift-ingress-canary/ingress-canary-x6vpd" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.565406 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lrrn\" (UniqueName: \"kubernetes.io/projected/0702836b-cde4-4ae0-9f51-b855a370c0f5-kube-api-access-9lrrn\") pod \"package-server-manager-789f6589d5-zm7fw\" (UID: \"0702836b-cde4-4ae0-9f51-b855a370c0f5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zm7fw" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.565453 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0359d20b-0121-43aa-8b56-04fc6210db6e-socket-dir\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.565480 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0702836b-cde4-4ae0-9f51-b855a370c0f5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zm7fw\" (UID: \"0702836b-cde4-4ae0-9f51-b855a370c0f5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zm7fw" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.565507 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4bab7374-a3b1-48d3-97f3-1b3e63392ff3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ngbt8\" (UID: \"4bab7374-a3b1-48d3-97f3-1b3e63392ff3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.565550 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6088e2cb-5b23-44f4-87a5-af1d5f36bca3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6lk7p\" (UID: \"6088e2cb-5b23-44f4-87a5-af1d5f36bca3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6lk7p" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.565573 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djlvj\" (UniqueName: \"kubernetes.io/projected/54e356e3-9033-4c55-b862-075b04e96bf2-kube-api-access-djlvj\") pod \"machine-config-server-dghc2\" (UID: \"54e356e3-9033-4c55-b862-075b04e96bf2\") " pod="openshift-machine-config-operator/machine-config-server-dghc2" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.565642 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/54e356e3-9033-4c55-b862-075b04e96bf2-node-bootstrap-token\") pod \"machine-config-server-dghc2\" (UID: \"54e356e3-9033-4c55-b862-075b04e96bf2\") " pod="openshift-machine-config-operator/machine-config-server-dghc2" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.565666 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4bab7374-a3b1-48d3-97f3-1b3e63392ff3-proxy-tls\") pod \"machine-config-operator-74547568cd-ngbt8\" (UID: \"4bab7374-a3b1-48d3-97f3-1b3e63392ff3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.565754 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0359d20b-0121-43aa-8b56-04fc6210db6e-registration-dir\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.567636 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e3fb6d8-549c-408c-b3bb-e8080a7b45f3-config-volume\") pod \"dns-default-44twq\" (UID: \"4e3fb6d8-549c-408c-b3bb-e8080a7b45f3\") " pod="openshift-dns/dns-default-44twq" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.567707 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/4e3fb6d8-549c-408c-b3bb-e8080a7b45f3-metrics-tls\") pod \"dns-default-44twq\" (UID: \"4e3fb6d8-549c-408c-b3bb-e8080a7b45f3\") " pod="openshift-dns/dns-default-44twq" Jan 20 11:04:46 crc kubenswrapper[4961]: E0120 11:04:46.568147 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:47.06812454 +0000 UTC m=+39.852624631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.568582 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/79d8764d-52dd-4350-9bef-e079ce9ade6a-profile-collector-cert\") pod \"catalog-operator-68c6474976-jmdhz\" (UID: \"79d8764d-52dd-4350-9bef-e079ce9ade6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.568733 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c819a2b6-2905-4966-aad3-55cf95ee88ef-cert\") pod \"ingress-canary-x6vpd\" (UID: \"c819a2b6-2905-4966-aad3-55cf95ee88ef\") " pod="openshift-ingress-canary/ingress-canary-x6vpd" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.569482 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/221c46d0-ccdb-4e6a-a143-04c3bce55711-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bslr7\" (UID: \"221c46d0-ccdb-4e6a-a143-04c3bce55711\") " pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.569622 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d48l8\" (UniqueName: \"kubernetes.io/projected/4bab7374-a3b1-48d3-97f3-1b3e63392ff3-kube-api-access-d48l8\") pod \"machine-config-operator-74547568cd-ngbt8\" (UID: \"4bab7374-a3b1-48d3-97f3-1b3e63392ff3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.569781 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6088e2cb-5b23-44f4-87a5-af1d5f36bca3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6lk7p\" (UID: \"6088e2cb-5b23-44f4-87a5-af1d5f36bca3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6lk7p" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.570553 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c38b7e7-a659-4038-aadf-b54948bfebf4-trusted-ca\") pod \"image-registry-697d97f7c8-z54rs\" (UID: 
\"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.570993 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c38b7e7-a659-4038-aadf-b54948bfebf4-ca-trust-extracted\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.571171 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gbcg\" (UniqueName: \"kubernetes.io/projected/79d8764d-52dd-4350-9bef-e079ce9ade6a-kube-api-access-8gbcg\") pod \"catalog-operator-68c6474976-jmdhz\" (UID: \"79d8764d-52dd-4350-9bef-e079ce9ade6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.676342 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.676952 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0359d20b-0121-43aa-8b56-04fc6210db6e-registration-dir\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.677088 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e3fb6d8-549c-408c-b3bb-e8080a7b45f3-config-volume\") pod \"dns-default-44twq\" (UID: \"4e3fb6d8-549c-408c-b3bb-e8080a7b45f3\") " pod="openshift-dns/dns-default-44twq" Jan 20 11:04:46 crc kubenswrapper[4961]: E0120 11:04:46.678441 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:47.178393801 +0000 UTC m=+39.962893672 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.678744 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0359d20b-0121-43aa-8b56-04fc6210db6e-registration-dir\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.678996 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e3fb6d8-549c-408c-b3bb-e8080a7b45f3-metrics-tls\") pod \"dns-default-44twq\" (UID: \"4e3fb6d8-549c-408c-b3bb-e8080a7b45f3\") " pod="openshift-dns/dns-default-44twq" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.679112 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/79d8764d-52dd-4350-9bef-e079ce9ade6a-profile-collector-cert\") pod \"catalog-operator-68c6474976-jmdhz\" (UID: \"79d8764d-52dd-4350-9bef-e079ce9ade6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.679483 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c819a2b6-2905-4966-aad3-55cf95ee88ef-cert\") pod \"ingress-canary-x6vpd\" (UID: \"c819a2b6-2905-4966-aad3-55cf95ee88ef\") " pod="openshift-ingress-canary/ingress-canary-x6vpd" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.679982 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e3fb6d8-549c-408c-b3bb-e8080a7b45f3-config-volume\") pod \"dns-default-44twq\" (UID: \"4e3fb6d8-549c-408c-b3bb-e8080a7b45f3\") " pod="openshift-dns/dns-default-44twq" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.680136 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/221c46d0-ccdb-4e6a-a143-04c3bce55711-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bslr7\" (UID: \"221c46d0-ccdb-4e6a-a143-04c3bce55711\") " pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.682083 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d48l8\" (UniqueName: \"kubernetes.io/projected/4bab7374-a3b1-48d3-97f3-1b3e63392ff3-kube-api-access-d48l8\") pod \"machine-config-operator-74547568cd-ngbt8\" (UID: \"4bab7374-a3b1-48d3-97f3-1b3e63392ff3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.682143 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6088e2cb-5b23-44f4-87a5-af1d5f36bca3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6lk7p\" (UID: 
\"6088e2cb-5b23-44f4-87a5-af1d5f36bca3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6lk7p" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.682645 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c38b7e7-a659-4038-aadf-b54948bfebf4-trusted-ca\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.682684 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c38b7e7-a659-4038-aadf-b54948bfebf4-ca-trust-extracted\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.682773 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gbcg\" (UniqueName: \"kubernetes.io/projected/79d8764d-52dd-4350-9bef-e079ce9ade6a-kube-api-access-8gbcg\") pod \"catalog-operator-68c6474976-jmdhz\" (UID: \"79d8764d-52dd-4350-9bef-e079ce9ade6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.682822 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0359d20b-0121-43aa-8b56-04fc6210db6e-plugins-dir\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.682847 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2rt\" (UniqueName: \"kubernetes.io/projected/0359d20b-0121-43aa-8b56-04fc6210db6e-kube-api-access-fd2rt\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.682877 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/54e356e3-9033-4c55-b862-075b04e96bf2-certs\") pod \"machine-config-server-dghc2\" (UID: \"54e356e3-9033-4c55-b862-075b04e96bf2\") " pod="openshift-machine-config-operator/machine-config-server-dghc2" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.682928 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/221c46d0-ccdb-4e6a-a143-04c3bce55711-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bslr7\" (UID: \"221c46d0-ccdb-4e6a-a143-04c3bce55711\") " pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.682988 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c38b7e7-a659-4038-aadf-b54948bfebf4-installation-pull-secrets\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.683023 4961 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0359d20b-0121-43aa-8b56-04fc6210db6e-mountpoint-dir\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.683130 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/79d8764d-52dd-4350-9bef-e079ce9ade6a-srv-cert\") pod \"catalog-operator-68c6474976-jmdhz\" (UID: \"79d8764d-52dd-4350-9bef-e079ce9ade6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.683162 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c38b7e7-a659-4038-aadf-b54948bfebf4-registry-certificates\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.683246 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f99sr\" (UniqueName: \"kubernetes.io/projected/4e3fb6d8-549c-408c-b3bb-e8080a7b45f3-kube-api-access-f99sr\") pod \"dns-default-44twq\" (UID: \"4e3fb6d8-549c-408c-b3bb-e8080a7b45f3\") " pod="openshift-dns/dns-default-44twq" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.683357 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwcwd\" (UniqueName: \"kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-kube-api-access-bwcwd\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.683421 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2njzn\" (UniqueName: \"kubernetes.io/projected/221c46d0-ccdb-4e6a-a143-04c3bce55711-kube-api-access-2njzn\") pod \"marketplace-operator-79b997595-bslr7\" (UID: \"221c46d0-ccdb-4e6a-a143-04c3bce55711\") " pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.683456 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4bab7374-a3b1-48d3-97f3-1b3e63392ff3-images\") pod \"machine-config-operator-74547568cd-ngbt8\" (UID: \"4bab7374-a3b1-48d3-97f3-1b3e63392ff3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.683536 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0359d20b-0121-43aa-8b56-04fc6210db6e-csi-data-dir\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.683574 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: 
\"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.683597 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-registry-tls\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.683687 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-bound-sa-token\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.683735 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6088e2cb-5b23-44f4-87a5-af1d5f36bca3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6lk7p\" (UID: \"6088e2cb-5b23-44f4-87a5-af1d5f36bca3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6lk7p" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.683767 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtr9c\" (UniqueName: \"kubernetes.io/projected/c819a2b6-2905-4966-aad3-55cf95ee88ef-kube-api-access-dtr9c\") pod \"ingress-canary-x6vpd\" (UID: \"c819a2b6-2905-4966-aad3-55cf95ee88ef\") " pod="openshift-ingress-canary/ingress-canary-x6vpd" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.683908 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lrrn\" (UniqueName: \"kubernetes.io/projected/0702836b-cde4-4ae0-9f51-b855a370c0f5-kube-api-access-9lrrn\") pod \"package-server-manager-789f6589d5-zm7fw\" (UID: \"0702836b-cde4-4ae0-9f51-b855a370c0f5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zm7fw" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.683971 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0359d20b-0121-43aa-8b56-04fc6210db6e-socket-dir\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.684002 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0702836b-cde4-4ae0-9f51-b855a370c0f5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zm7fw\" (UID: \"0702836b-cde4-4ae0-9f51-b855a370c0f5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zm7fw" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.684036 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6088e2cb-5b23-44f4-87a5-af1d5f36bca3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6lk7p\" (UID: \"6088e2cb-5b23-44f4-87a5-af1d5f36bca3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6lk7p" Jan 20 
11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.684076 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djlvj\" (UniqueName: \"kubernetes.io/projected/54e356e3-9033-4c55-b862-075b04e96bf2-kube-api-access-djlvj\") pod \"machine-config-server-dghc2\" (UID: \"54e356e3-9033-4c55-b862-075b04e96bf2\") " pod="openshift-machine-config-operator/machine-config-server-dghc2" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.684586 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4bab7374-a3b1-48d3-97f3-1b3e63392ff3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ngbt8\" (UID: \"4bab7374-a3b1-48d3-97f3-1b3e63392ff3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.684630 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/54e356e3-9033-4c55-b862-075b04e96bf2-node-bootstrap-token\") pod \"machine-config-server-dghc2\" (UID: \"54e356e3-9033-4c55-b862-075b04e96bf2\") " pod="openshift-machine-config-operator/machine-config-server-dghc2" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.684664 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4bab7374-a3b1-48d3-97f3-1b3e63392ff3-proxy-tls\") pod \"machine-config-operator-74547568cd-ngbt8\" (UID: \"4bab7374-a3b1-48d3-97f3-1b3e63392ff3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.688582 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0359d20b-0121-43aa-8b56-04fc6210db6e-plugins-dir\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.694713 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c38b7e7-a659-4038-aadf-b54948bfebf4-registry-certificates\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.694856 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/79d8764d-52dd-4350-9bef-e079ce9ade6a-profile-collector-cert\") pod \"catalog-operator-68c6474976-jmdhz\" (UID: \"79d8764d-52dd-4350-9bef-e079ce9ade6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.695105 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-v7tsh"] Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.695860 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4bab7374-a3b1-48d3-97f3-1b3e63392ff3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ngbt8\" (UID: \"4bab7374-a3b1-48d3-97f3-1b3e63392ff3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" Jan 20 
11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.696233 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/221c46d0-ccdb-4e6a-a143-04c3bce55711-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bslr7\" (UID: \"221c46d0-ccdb-4e6a-a143-04c3bce55711\") " pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.696507 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0359d20b-0121-43aa-8b56-04fc6210db6e-mountpoint-dir\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.696949 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c819a2b6-2905-4966-aad3-55cf95ee88ef-cert\") pod \"ingress-canary-x6vpd\" (UID: \"c819a2b6-2905-4966-aad3-55cf95ee88ef\") " pod="openshift-ingress-canary/ingress-canary-x6vpd" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.697034 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0359d20b-0121-43aa-8b56-04fc6210db6e-csi-data-dir\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.698213 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/54e356e3-9033-4c55-b862-075b04e96bf2-certs\") pod \"machine-config-server-dghc2\" (UID: \"54e356e3-9033-4c55-b862-075b04e96bf2\") " pod="openshift-machine-config-operator/machine-config-server-dghc2" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.698556 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6088e2cb-5b23-44f4-87a5-af1d5f36bca3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6lk7p\" (UID: \"6088e2cb-5b23-44f4-87a5-af1d5f36bca3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6lk7p" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.698610 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c38b7e7-a659-4038-aadf-b54948bfebf4-ca-trust-extracted\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: E0120 11:04:46.698814 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:47.198794164 +0000 UTC m=+39.983294035 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.699021 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4bab7374-a3b1-48d3-97f3-1b3e63392ff3-proxy-tls\") pod \"machine-config-operator-74547568cd-ngbt8\" (UID: \"4bab7374-a3b1-48d3-97f3-1b3e63392ff3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.699607 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4bab7374-a3b1-48d3-97f3-1b3e63392ff3-images\") pod \"machine-config-operator-74547568cd-ngbt8\" (UID: \"4bab7374-a3b1-48d3-97f3-1b3e63392ff3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.700334 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c38b7e7-a659-4038-aadf-b54948bfebf4-trusted-ca\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.700735 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0359d20b-0121-43aa-8b56-04fc6210db6e-socket-dir\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.701439 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e3fb6d8-549c-408c-b3bb-e8080a7b45f3-metrics-tls\") pod \"dns-default-44twq\" (UID: \"4e3fb6d8-549c-408c-b3bb-e8080a7b45f3\") " pod="openshift-dns/dns-default-44twq" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.702660 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/79d8764d-52dd-4350-9bef-e079ce9ade6a-srv-cert\") pod \"catalog-operator-68c6474976-jmdhz\" (UID: \"79d8764d-52dd-4350-9bef-e079ce9ade6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.703244 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0702836b-cde4-4ae0-9f51-b855a370c0f5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zm7fw\" (UID: \"0702836b-cde4-4ae0-9f51-b855a370c0f5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zm7fw" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.703584 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/54e356e3-9033-4c55-b862-075b04e96bf2-node-bootstrap-token\") pod 
\"machine-config-server-dghc2\" (UID: \"54e356e3-9033-4c55-b862-075b04e96bf2\") " pod="openshift-machine-config-operator/machine-config-server-dghc2" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.707373 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/221c46d0-ccdb-4e6a-a143-04c3bce55711-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bslr7\" (UID: \"221c46d0-ccdb-4e6a-a143-04c3bce55711\") " pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.708379 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c38b7e7-a659-4038-aadf-b54948bfebf4-installation-pull-secrets\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.709511 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6088e2cb-5b23-44f4-87a5-af1d5f36bca3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6lk7p\" (UID: \"6088e2cb-5b23-44f4-87a5-af1d5f36bca3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6lk7p" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.710784 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-registry-tls\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.728513 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwcwd\" (UniqueName: \"kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-kube-api-access-bwcwd\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.744110 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lrrn\" (UniqueName: \"kubernetes.io/projected/0702836b-cde4-4ae0-9f51-b855a370c0f5-kube-api-access-9lrrn\") pod \"package-server-manager-789f6589d5-zm7fw\" (UID: \"0702836b-cde4-4ae0-9f51-b855a370c0f5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zm7fw" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.773092 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6088e2cb-5b23-44f4-87a5-af1d5f36bca3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6lk7p\" (UID: \"6088e2cb-5b23-44f4-87a5-af1d5f36bca3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6lk7p" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.785615 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Jan 20 11:04:46 crc kubenswrapper[4961]: E0120 11:04:46.786338 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:47.286175624 +0000 UTC m=+40.070675495 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.820058 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7r52t"] Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.847154 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tz72b"] Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.859287 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm"] Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.888221 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: E0120 11:04:46.888795 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:47.388770353 +0000 UTC m=+40.173270224 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.903980 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" event={"ID":"78a4572e-93b5-40eb-b11c-98a39f3c6a5b","Type":"ContainerStarted","Data":"5b2381336084ee6b4c58ab7166919218759af4081282f2f056d26b6af7593249"} Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.907244 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7szjl" event={"ID":"cf90e4e1-8fba-4df9-8e30-b392921d4d16","Type":"ContainerStarted","Data":"5af670c416e3b56b88010df28b1fb6a4cb9442a3d12ded6a816a3033fb7d60d8"} Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.913915 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" event={"ID":"154d567d-eccf-4771-a5e0-60b2375d3e8b","Type":"ContainerStarted","Data":"4db3ca9bc4791f04023c2f572996d7facd49e710e40608a9baac50f5abdc4ae2"} Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.917448 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gbcg\" (UniqueName: \"kubernetes.io/projected/79d8764d-52dd-4350-9bef-e079ce9ade6a-kube-api-access-8gbcg\") pod \"catalog-operator-68c6474976-jmdhz\" (UID: \"79d8764d-52dd-4350-9bef-e079ce9ade6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.917448 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djlvj\" (UniqueName: \"kubernetes.io/projected/54e356e3-9033-4c55-b862-075b04e96bf2-kube-api-access-djlvj\") pod \"machine-config-server-dghc2\" (UID: \"54e356e3-9033-4c55-b862-075b04e96bf2\") " pod="openshift-machine-config-operator/machine-config-server-dghc2" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.917924 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2njzn\" (UniqueName: \"kubernetes.io/projected/221c46d0-ccdb-4e6a-a143-04c3bce55711-kube-api-access-2njzn\") pod \"marketplace-operator-79b997595-bslr7\" (UID: \"221c46d0-ccdb-4e6a-a143-04c3bce55711\") " pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.919685 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd2rt\" (UniqueName: \"kubernetes.io/projected/0359d20b-0121-43aa-8b56-04fc6210db6e-kube-api-access-fd2rt\") pod \"csi-hostpathplugin-xd5gg\" (UID: \"0359d20b-0121-43aa-8b56-04fc6210db6e\") " pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.921272 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f99sr\" (UniqueName: \"kubernetes.io/projected/4e3fb6d8-549c-408c-b3bb-e8080a7b45f3-kube-api-access-f99sr\") pod \"dns-default-44twq\" (UID: \"4e3fb6d8-549c-408c-b3bb-e8080a7b45f3\") " pod="openshift-dns/dns-default-44twq" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 
11:04:46.921399 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d48l8\" (UniqueName: \"kubernetes.io/projected/4bab7374-a3b1-48d3-97f3-1b3e63392ff3-kube-api-access-d48l8\") pod \"machine-config-operator-74547568cd-ngbt8\" (UID: \"4bab7374-a3b1-48d3-97f3-1b3e63392ff3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.921746 4961 generic.go:334] "Generic (PLEG): container finished" podID="9f0a10d3-171e-4695-b66e-870bb63e5712" containerID="5566d1dfce80e5e5c923471d4676d6492934e86bb59500f00b58b0e3d43b1901" exitCode=0 Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.921916 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" event={"ID":"9f0a10d3-171e-4695-b66e-870bb63e5712","Type":"ContainerDied","Data":"5566d1dfce80e5e5c923471d4676d6492934e86bb59500f00b58b0e3d43b1901"} Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.923626 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtr9c\" (UniqueName: \"kubernetes.io/projected/c819a2b6-2905-4966-aad3-55cf95ee88ef-kube-api-access-dtr9c\") pod \"ingress-canary-x6vpd\" (UID: \"c819a2b6-2905-4966-aad3-55cf95ee88ef\") " pod="openshift-ingress-canary/ingress-canary-x6vpd" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.924129 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6xx2v" event={"ID":"d5ac0979-0fc9-48a6-8d22-6ba2c646287a","Type":"ContainerStarted","Data":"9e2f73283525e5db9a51cf641cb6d22a9d41954d1d65b7caef40dc62f8a83ed2"} Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.926044 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" event={"ID":"9cbd861e-1c54-4ec8-beca-021062bf2924","Type":"ContainerStarted","Data":"0d4fc695e90442a295b0df374af80b2504aa8558477f21a5b5a3c692dcdb59f7"} Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.930226 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjmbx" event={"ID":"1022948e-9743-4b7f-9e25-2e2b9070789c","Type":"ContainerStarted","Data":"695ef4ed50e9d18579c4bfa356121547ac48c36a60565e9091d4cc1f1bb35350"} Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.932861 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.933944 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" event={"ID":"aa2b6ee6-f793-43c1-bde1-92e1d8e67754","Type":"ContainerStarted","Data":"4720990698c8ef72b74e6039e92ccc240e2a606954e4892e52963f404fb50edc"} Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.933989 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" event={"ID":"aa2b6ee6-f793-43c1-bde1-92e1d8e67754","Type":"ContainerStarted","Data":"2c151e959d14e01e9050c7494f2701e10129bc2b77124add106ff5d4c5f6a094"} Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.944347 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-bound-sa-token\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.953582 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" event={"ID":"68d3d73a-7ef8-49ee-ae94-2d73115e126e","Type":"ContainerStarted","Data":"7f08d3d4d95c2a1776457b5b7921be61184652f9f827f1cdd8b7df0f01586d5e"} Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.956616 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.967725 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zm7fw" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.973184 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.989245 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6lk7p" Jan 20 11:04:46 crc kubenswrapper[4961]: I0120 11:04:46.989488 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:46 crc kubenswrapper[4961]: E0120 11:04:46.990456 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:47.49043422 +0000 UTC m=+40.274934281 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:46 crc kubenswrapper[4961]: W0120 11:04:46.994491 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff8f8f6e_bb01_49c6_864a_9a98a57abea8.slice/crio-cdabdf1b24e557b3d3dbbe7531aec1d26107dab4c250948baa02ad7c1fd92f06 WatchSource:0}: Error finding container cdabdf1b24e557b3d3dbbe7531aec1d26107dab4c250948baa02ad7c1fd92f06: Status 404 returned error can't find the container with id cdabdf1b24e557b3d3dbbe7531aec1d26107dab4c250948baa02ad7c1fd92f06 Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.046097 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-x6vpd" Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.048251 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dghc2" Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.065269 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf"] Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.086267 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-44twq" Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.092823 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:47 crc kubenswrapper[4961]: E0120 11:04:47.094031 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:47.594010212 +0000 UTC m=+40.378510083 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.111622 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.166895 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jjg29"] Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.194157 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.194894 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs\") pod \"network-metrics-daemon-cpvtl\" (UID: \"f4c616c0-5852-4a0c-98e7-7d6af398ed2e\") " pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:47 crc kubenswrapper[4961]: E0120 11:04:47.210903 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:47.710844379 +0000 UTC m=+40.495344250 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.232770 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c616c0-5852-4a0c-98e7-7d6af398ed2e-metrics-certs\") pod \"network-metrics-daemon-cpvtl\" (UID: \"f4c616c0-5852-4a0c-98e7-7d6af398ed2e\") " pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:47 crc kubenswrapper[4961]: W0120 11:04:47.233388 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51f20be1_6fa3_47ce_ac42_6d9a618ae151.slice/crio-8f6b3e2f3e67fb812a02bf97ff754919949fc30ff38821f046abec0d4c6d81df WatchSource:0}: Error finding container 8f6b3e2f3e67fb812a02bf97ff754919949fc30ff38821f046abec0d4c6d81df: Status 404 returned error can't find the container with id 8f6b3e2f3e67fb812a02bf97ff754919949fc30ff38821f046abec0d4c6d81df Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.252457 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cpvtl" Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.300402 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-l62cr"] Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.311273 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:47 crc kubenswrapper[4961]: E0120 11:04:47.311807 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:47.811789549 +0000 UTC m=+40.596289420 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.346405 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-t464w"] Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.368142 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq"] Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.370400 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dhf59"] Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.374040 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-jq4wg"] Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.412607 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:47 crc kubenswrapper[4961]: E0120 11:04:47.413748 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:47.913708422 +0000 UTC m=+40.698208313 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.514784 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:47 crc kubenswrapper[4961]: E0120 11:04:47.520233 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:48.020180013 +0000 UTC m=+40.804679884 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:47 crc kubenswrapper[4961]: W0120 11:04:47.561273 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bc6ab6c_6aaa_4c35_bbbe_24448f57cbf9.slice/crio-b3c3189b43756943f0fa642a515342e6b06c07fd69f3632388693ff001b00749 WatchSource:0}: Error finding container b3c3189b43756943f0fa642a515342e6b06c07fd69f3632388693ff001b00749: Status 404 returned error can't find the container with id b3c3189b43756943f0fa642a515342e6b06c07fd69f3632388693ff001b00749 Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.566638 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fl5hs"] Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.622012 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:47 crc kubenswrapper[4961]: E0120 11:04:47.626859 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:48.126830979 +0000 UTC m=+40.911330840 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.628024 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:47 crc kubenswrapper[4961]: E0120 11:04:47.635329 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:48.135303629 +0000 UTC m=+40.919803500 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:47 crc kubenswrapper[4961]: W0120 11:04:47.707518 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaeb7e86c_90a7_48d9_a641_0814203fce0d.slice/crio-3dd50122b4b09a0944af2c9cab271a06db5952f08482b9e2b485f12228151801 WatchSource:0}: Error finding container 3dd50122b4b09a0944af2c9cab271a06db5952f08482b9e2b485f12228151801: Status 404 returned error can't find the container with id 3dd50122b4b09a0944af2c9cab271a06db5952f08482b9e2b485f12228151801 Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.731262 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:47 crc kubenswrapper[4961]: E0120 11:04:47.731829 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:48.231804994 +0000 UTC m=+41.016304865 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.826153 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws"] Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.835266 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:47 crc kubenswrapper[4961]: E0120 11:04:47.835797 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:48.335780056 +0000 UTC m=+41.120279927 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.937111 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:47 crc kubenswrapper[4961]: E0120 11:04:47.937866 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:48.437837883 +0000 UTC m=+41.222337754 (durationBeforeRetry 500ms). 
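[Editor's note] The failure pair repeated above, UnmountVolume.TearDown for the terminating pod 8f668bae-612b-4b75-9490-919e737c6a3b and MountVolume.MountDevice for image-registry-697d97f7c8-z54rs, has a single cause: no CSI driver named kubevirt.io.hostpath-provisioner has registered with the kubelet yet, so both volume operations are requeued with a 500ms backoff ("No retries permitted until ..."). They stop once the csi-hostpathplugin pod, scheduled further down in this log, starts and its registrar announces the driver. The Go sketch below is illustrative only, not kubelet code; it assumes the default kubelet root directory and simply lists the plugin-registration sockets so you can check which drivers have announced themselves.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Conventional kubelet plugin-registration directory (assumes the default --root-dir).
	const regDir = "/var/lib/kubelet/plugins_registry"
	entries, err := os.ReadDir(regDir)
	if err != nil {
		fmt.Fprintf(os.Stderr, "cannot read %s: %v\n", regDir, err)
		os.Exit(1)
	}
	if len(entries) == 0 {
		fmt.Println("no CSI drivers registered yet")
	}
	for _, e := range entries {
		// node-driver-registrar sidecars typically create <driver-name>-reg.sock here.
		fmt.Println(filepath.Join(regDir, e.Name()))
	}
}

Running it on the node while these errors are firing, and again after csi-hostpathplugin-xd5gg comes up, should show the kubevirt.io.hostpath-provisioner registration socket appear, at which point the retries succeed.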
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.938101 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:47 crc kubenswrapper[4961]: E0120 11:04:47.938782 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:48.438757215 +0000 UTC m=+41.223257086 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.980350 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-l62cr" event={"ID":"68816dca-2483-4789-9db6-614582f6c45a","Type":"ContainerStarted","Data":"3d2f7d1235ba3b9dd27495f346b7ce183a0865e605e8249f11c716affdc784ec"} Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.988180 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" event={"ID":"82cc64f1-0377-43e9-94a0-213d82b4a415","Type":"ContainerStarted","Data":"63396020482cc7e372d5186d1bc2d5a75a304d516d3ba361c67111c453367d65"} Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.994092 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tz72b" event={"ID":"7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58","Type":"ContainerStarted","Data":"4529dbe27e09d202b4579290755d1b6f84a02a469a679b166f4c9e9e6b894b00"} Jan 20 11:04:47 crc kubenswrapper[4961]: I0120 11:04:47.994161 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tz72b" event={"ID":"7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58","Type":"ContainerStarted","Data":"d4803ad50a15c195a908353771feabeb3e6b1ebfa3d33ecd35930493b902871c"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.001377 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-tz72b" Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.010410 4961 patch_prober.go:28] interesting pod/console-operator-58897d9998-tz72b container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 
10.217.0.21:8443: connect: connection refused" start-of-body= Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.010491 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tz72b" podUID="7d82cac5-4ce6-4fd1-b3c2-d7803a4b0e58" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.041305 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:48 crc kubenswrapper[4961]: E0120 11:04:48.042306 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:48.542279145 +0000 UTC m=+41.326779016 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.042566 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:48 crc kubenswrapper[4961]: E0120 11:04:48.044605 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:48.544585609 +0000 UTC m=+41.329085480 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.104231 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7szjl" event={"ID":"cf90e4e1-8fba-4df9-8e30-b392921d4d16","Type":"ContainerStarted","Data":"9051cbeee94b056bafae53937fbe39b4cccc68ed1dbb93a2e6b300cf5ceecd09"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.109120 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-7szjl" Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.110936 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.113337 4961 patch_prober.go:28] interesting pod/downloads-7954f5f757-7szjl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.113415 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7szjl" podUID="cf90e4e1-8fba-4df9-8e30-b392921d4d16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.113951 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" event={"ID":"28d61801-55c0-4e08-99b9-0a6b3d16fe71","Type":"ContainerStarted","Data":"b7f7d1d79aa5300fd61894bd54d5181ebbbe0b67704bfb433ef2487667e73d2c"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.136386 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.143767 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:48 crc kubenswrapper[4961]: E0120 11:04:48.145278 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:48.645256293 +0000 UTC m=+41.429756164 (durationBeforeRetry 500ms). 
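[Editor's note] The readiness failures above for console-operator (https://10.217.0.21:8443/readyz) and downloads (http://10.217.0.6:8080/) are the normal "connection refused" window between ContainerStarted and the moment the process binds its port; the kubelet keeps probing and marks the pods ready later in this log. A minimal sketch of this style of HTTP readiness check follows; it is not the kubelet's prober code, the URL is just the downloads endpoint copied from the log, and TLS handling (needed for the https probes) is omitted.

package main

import (
	"fmt"
	"net/http"
	"time"
)

// ready returns true when the endpoint answers with a 2xx/3xx status.
func ready(url string) bool {
	client := &http.Client{Timeout: 2 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		// e.g. "dial tcp 10.217.0.6:8080: connect: connection refused" while the server is still starting
		fmt.Println("probe failed:", err)
		return false
	}
	defer resp.Body.Close()
	return resp.StatusCode >= 200 && resp.StatusCode < 400
}

func main() {
	fmt.Println("ready:", ready("http://10.217.0.6:8080/"))
}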
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.190141 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kdxfq"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.193991 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-t464w" event={"ID":"bef8a518-6665-402b-98f1-81b5db29d4ed","Type":"ContainerStarted","Data":"e58a8c8d2b3d1f0a190ee2d68386768906f01e110109c7c05bd06f5528d989d7"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.203148 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nszp2"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.203190 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.203202 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7r52t" event={"ID":"a81eef7a-7f29-4f79-863e-c3e009b56ad8","Type":"ContainerStarted","Data":"c264e0dbc08e5415d951e501ae993490258751c2ffb2bbb71687c208b2b9f3d5"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.207013 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2vqvf"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.212793 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9xqq"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.216262 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dhf59" event={"ID":"080291d9-3dec-4cc7-aeff-f9dbdb7abb68","Type":"ContainerStarted","Data":"04548f717b4115c6e85ff75a9b98d00bdde2261be8bc2124e2f6bf3b8c934797"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.218692 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" event={"ID":"9cbd861e-1c54-4ec8-beca-021062bf2924","Type":"ContainerStarted","Data":"6e15ad136584f7506bc3e19cf01ce3794c8e73437b32f724fd0b79abb362a6e8"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.222721 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" event={"ID":"9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9","Type":"ContainerStarted","Data":"b3c3189b43756943f0fa642a515342e6b06c07fd69f3632388693ff001b00749"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.230018 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" event={"ID":"68d3d73a-7ef8-49ee-ae94-2d73115e126e","Type":"ContainerStarted","Data":"ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca"} Jan 20 11:04:48 crc 
kubenswrapper[4961]: I0120 11:04:48.230225 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.233216 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" event={"ID":"154d567d-eccf-4771-a5e0-60b2375d3e8b","Type":"ContainerStarted","Data":"9f45da05e7662a744acabd224564666ad16efd7d2e84a747cc91a0dd59a4ca59"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.236466 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" event={"ID":"ff8f8f6e-bb01-49c6-864a-9a98a57abea8","Type":"ContainerStarted","Data":"60b4ec1b88d62ce3dd54d4c626bb9f11e461cb14a5107a9d7133c6efcf9527da"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.236492 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" event={"ID":"ff8f8f6e-bb01-49c6-864a-9a98a57abea8","Type":"ContainerStarted","Data":"cdabdf1b24e557b3d3dbbe7531aec1d26107dab4c250948baa02ad7c1fd92f06"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.247259 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:48 crc kubenswrapper[4961]: E0120 11:04:48.248516 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:48.748501168 +0000 UTC m=+41.533001039 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.291965 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tk9p" event={"ID":"aaca7742-2ff5-4b80-9ea2-ed94aa869684","Type":"ContainerStarted","Data":"d441b4cb1bd4273e8ebcbb0be4633c68eba3e0241a54349fa5d8d17d24d98334"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.292947 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6xx2v" event={"ID":"d5ac0979-0fc9-48a6-8d22-6ba2c646287a","Type":"ContainerStarted","Data":"fef90d4bf5f068e221a66164d8539c760a8f6a619bd464906a159534b851347e"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.294253 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjmbx" event={"ID":"1022948e-9743-4b7f-9e25-2e2b9070789c","Type":"ContainerStarted","Data":"8246067735a5d8dbba65a937b99a5009a4c98511dab5f46e46d8d82d4adba07a"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.294420 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bwhk5"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.308997 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kgt2j"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.309389 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" event={"ID":"088f8776-75dd-4adc-9647-6a03e411313e","Type":"ContainerStarted","Data":"a84ccacfcb046ceb62b89df87c32c4d9a95e12750ce59770d0bee5cd2dee3bb9"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.341859 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" event={"ID":"51f20be1-6fa3-47ce-ac42-6d9a618ae151","Type":"ContainerStarted","Data":"8f6b3e2f3e67fb812a02bf97ff754919949fc30ff38821f046abec0d4c6d81df"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.342527 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.343709 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2cs6z" event={"ID":"fc451e0b-ed99-4138-8e62-01d91d2c914f","Type":"ContainerStarted","Data":"10f11c5ae1b90f0604169bc53c3ffa7fdef1238e4b93b37d7f06abfcd7fb97c9"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.343740 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2cs6z" event={"ID":"fc451e0b-ed99-4138-8e62-01d91d2c914f","Type":"ContainerStarted","Data":"4ef75619d3208eb04b72b9662257830dcbad87054257397a45b4e13f4f7cfd90"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.348056 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:48 crc kubenswrapper[4961]: E0120 11:04:48.348455 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:48.848420364 +0000 UTC m=+41.632920235 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.349927 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:48 crc kubenswrapper[4961]: E0120 11:04:48.350488 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:48.850470472 +0000 UTC m=+41.634970343 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:48 crc kubenswrapper[4961]: W0120 11:04:48.366292 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb736ffe7_8ba8_4d20_8831_37d44f8d63de.slice/crio-23c06fa0624571ada2479e7a0562a9ddb3d0d13cf0a2526e05a1ad08194821de WatchSource:0}: Error finding container 23c06fa0624571ada2479e7a0562a9ddb3d0d13cf0a2526e05a1ad08194821de: Status 404 returned error can't find the container with id 23c06fa0624571ada2479e7a0562a9ddb3d0d13cf0a2526e05a1ad08194821de Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.373392 4961 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-86klf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.373449 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" podUID="51f20be1-6fa3-47ce-ac42-6d9a618ae151" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Jan 20 11:04:48 crc kubenswrapper[4961]: W0120 11:04:48.375663 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e828266_647a_4cc3_9e4e_4d27d1ddbda1.slice/crio-43c5b088849665bc724080765d60249a0ebea0d0240302147b579d4086847159 WatchSource:0}: Error finding container 43c5b088849665bc724080765d60249a0ebea0d0240302147b579d4086847159: Status 404 returned error can't find the container with id 43c5b088849665bc724080765d60249a0ebea0d0240302147b579d4086847159 Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.377186 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" event={"ID":"78a4572e-93b5-40eb-b11c-98a39f3c6a5b","Type":"ContainerStarted","Data":"da24b44cbf75971cd5e43f8b57d4fea592fd88e9cea46653ab4654661abeb847"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.377284 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.386280 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dghc2" event={"ID":"54e356e3-9033-4c55-b862-075b04e96bf2","Type":"ContainerStarted","Data":"58f64e5a84bf3bc25a94b05a9a720e514a3180555e83f514247426e141a6d08c"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.389333 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-v7tsh" event={"ID":"9f13654a-0829-467a-8faa-8cbba4049aca","Type":"ContainerStarted","Data":"f63db9abfc05fd5c2c809070f4e2cf8a9c0585e8b59864e24c87854bdad03b6b"} Jan 20 
11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.390523 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fl5hs" event={"ID":"aeb7e86c-90a7-48d9-a641-0814203fce0d","Type":"ContainerStarted","Data":"3dd50122b4b09a0944af2c9cab271a06db5952f08482b9e2b485f12228151801"} Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.399446 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.400323 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-fk28r" podStartSLOduration=17.400307202 podStartE2EDuration="17.400307202s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:48.400109347 +0000 UTC m=+41.184609218" watchObservedRunningTime="2026-01-20 11:04:48.400307202 +0000 UTC m=+41.184807073" Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.400984 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-7szjl" podStartSLOduration=17.400975548 podStartE2EDuration="17.400975548s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:48.34066528 +0000 UTC m=+41.125165151" watchObservedRunningTime="2026-01-20 11:04:48.400975548 +0000 UTC m=+41.185475409" Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.412964 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.413335 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.428597 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" podStartSLOduration=5.4285770620000005 podStartE2EDuration="5.428577062s" podCreationTimestamp="2026-01-20 11:04:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:48.426750288 +0000 UTC m=+41.211250159" watchObservedRunningTime="2026-01-20 11:04:48.428577062 +0000 UTC m=+41.213076933" Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.441323 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6lk7p"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.452283 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:48 crc kubenswrapper[4961]: E0120 11:04:48.453754 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:48.953734417 +0000 UTC m=+41.738234288 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.481599 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-44twq"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.493255 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-6xx2v" podStartSLOduration=17.493223712 podStartE2EDuration="17.493223712s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:48.47962649 +0000 UTC m=+41.264126361" watchObservedRunningTime="2026-01-20 11:04:48.493223712 +0000 UTC m=+41.277723583" Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.532076 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" podStartSLOduration=17.532031061 podStartE2EDuration="17.532031061s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:48.531494358 +0000 UTC m=+41.315994239" watchObservedRunningTime="2026-01-20 11:04:48.532031061 +0000 UTC m=+41.316530932" Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.555676 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wjmbx" podStartSLOduration=17.555653741 podStartE2EDuration="17.555653741s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:48.553825137 +0000 UTC m=+41.338325008" watchObservedRunningTime="2026-01-20 11:04:48.555653741 +0000 UTC m=+41.340153612" Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.556733 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:48 crc kubenswrapper[4961]: E0120 11:04:48.574982 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:49.074956938 +0000 UTC m=+41.859456809 (durationBeforeRetry 500ms). 
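[Editor's note] The pod_startup_latency_tracker entries above report podStartSLOduration as the elapsed time between podCreationTimestamp and the watch-observed running time; the zeroed firstStartedPulling/lastFinishedPulling values mean no image pull time was involved. A small sketch of that arithmetic, reusing the authentication-operator timestamps from this log (which fields the kubelet actually subtracts is an assumption based on the values shown):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout for the "2026-01-20 11:04:31 +0000 UTC" style timestamps in these fields;
	// time.Parse accepts the optional fractional seconds without them being in the layout.
	const layout = "2006-01-02 15:04:05 -0700 MST"
	created, err := time.Parse(layout, "2026-01-20 11:04:31 +0000 UTC")
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2026-01-20 11:04:48.400307202 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// Prints 17.400307202, matching podStartSLOduration for authentication-operator-69f744f599-fk28r.
	fmt.Printf("%.9fs\n", observed.Sub(created).Seconds())
}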
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.598954 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zm7fw"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.676654 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-tz72b" podStartSLOduration=17.676631275 podStartE2EDuration="17.676631275s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:48.675528389 +0000 UTC m=+41.460028260" watchObservedRunningTime="2026-01-20 11:04:48.676631275 +0000 UTC m=+41.461131146" Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.679625 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:48 crc kubenswrapper[4961]: E0120 11:04:48.680338 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:49.180311582 +0000 UTC m=+41.964811453 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.754259 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.767707 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cpvtl"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.770430 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-x6vpd"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.781983 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:48 crc kubenswrapper[4961]: E0120 11:04:48.782501 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:49.282483841 +0000 UTC m=+42.066983712 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.784145 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bslr7"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.809252 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xd5gg"] Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.892874 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:48 crc kubenswrapper[4961]: E0120 11:04:48.893316 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:49.393290995 +0000 UTC m=+42.177790866 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:48 crc kubenswrapper[4961]: I0120 11:04:48.940130 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" podStartSLOduration=17.940105974 podStartE2EDuration="17.940105974s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:48.939462708 +0000 UTC m=+41.723962579" watchObservedRunningTime="2026-01-20 11:04:48.940105974 +0000 UTC m=+41.724605865" Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.001812 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:49 crc kubenswrapper[4961]: E0120 11:04:49.002356 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:49.502333587 +0000 UTC m=+42.286833458 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.084752 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-2cs6z" podStartSLOduration=18.084735348 podStartE2EDuration="18.084735348s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:49.050762834 +0000 UTC m=+41.835262705" watchObservedRunningTime="2026-01-20 11:04:49.084735348 +0000 UTC m=+41.869235209" Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.097024 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.108456 4961 patch_prober.go:28] interesting pod/router-default-5444994796-6xx2v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 11:04:49 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Jan 20 11:04:49 crc kubenswrapper[4961]: [+]process-running ok Jan 20 11:04:49 crc kubenswrapper[4961]: healthz check failed Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.108520 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xx2v" podUID="d5ac0979-0fc9-48a6-8d22-6ba2c646287a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.109317 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:49 crc kubenswrapper[4961]: E0120 11:04:49.109599 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:49.609585917 +0000 UTC m=+42.394085788 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.211637 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:49 crc kubenswrapper[4961]: E0120 11:04:49.211957 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:49.71194496 +0000 UTC m=+42.496444831 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.312631 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:49 crc kubenswrapper[4961]: E0120 11:04:49.313265 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:49.813241409 +0000 UTC m=+42.597741280 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.341498 4961 csr.go:261] certificate signing request csr-bk89s is approved, waiting to be issued Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.368718 4961 csr.go:257] certificate signing request csr-bk89s is issued Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.414617 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:49 crc kubenswrapper[4961]: E0120 11:04:49.415558 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:49.915538721 +0000 UTC m=+42.700038592 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.473487 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-v7tsh" event={"ID":"9f13654a-0829-467a-8faa-8cbba4049aca","Type":"ContainerStarted","Data":"2e2572fa4085ef29a86f2a35313a4d23f02443d308d07c73d3aeaea698a8ae81"} Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.481162 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-t464w" event={"ID":"bef8a518-6665-402b-98f1-81b5db29d4ed","Type":"ContainerStarted","Data":"336ee3e2f2511830963da2129f20b78176000109315052634a118e511822d96e"} Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.485142 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2vqvf" event={"ID":"348fca40-a376-4560-a301-81c5d7dc93dd","Type":"ContainerStarted","Data":"cd638f7abca9314209e3bdb6f6ff446f8b08ed0c71a2a159894cdbd07b029b3a"} Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.510788 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bwhk5" event={"ID":"f2e00fdd-4780-4faa-a3c3-75b59218a5a2","Type":"ContainerStarted","Data":"c3956b2e815c2706ebc80be87bcb51ffb07e666b49ea278aa6665f0f243f0f28"} Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.519624 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:49 crc kubenswrapper[4961]: E0120 11:04:49.520845 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:50.020825334 +0000 UTC m=+42.805325215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.577599 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2" event={"ID":"8aff5260-35ea-4648-af32-33699d9118c3","Type":"ContainerStarted","Data":"808a160868b1a1c28e42dcff2a3e10091079fa6f02d5acd028bfe1f09d2b1301"} Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.577635 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2" event={"ID":"8aff5260-35ea-4648-af32-33699d9118c3","Type":"ContainerStarted","Data":"56472c1d0618d393461427c59c7372b0cc0d65c2bacab2e95ead0de062c9c8bc"} Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.590167 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dghc2" event={"ID":"54e356e3-9033-4c55-b862-075b04e96bf2","Type":"ContainerStarted","Data":"fe9998560c13763168d297a076afe163d0daf3318fb805af6d53b08528640dc9"} Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.616714 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9xqq" event={"ID":"4e828266-647a-4cc3-9e4e-4d27d1ddbda1","Type":"ContainerStarted","Data":"43c5b088849665bc724080765d60249a0ebea0d0240302147b579d4086847159"} Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.621842 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:49 crc kubenswrapper[4961]: E0120 11:04:49.623692 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:50.123678579 +0000 UTC m=+42.908178450 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.681040 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" podStartSLOduration=18.681025717 podStartE2EDuration="18.681025717s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:49.679648705 +0000 UTC m=+42.464148576" watchObservedRunningTime="2026-01-20 11:04:49.681025717 +0000 UTC m=+42.465525588" Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.723716 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:49 crc kubenswrapper[4961]: E0120 11:04:49.723910 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:50.223870152 +0000 UTC m=+43.008370023 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.724387 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:49 crc kubenswrapper[4961]: E0120 11:04:49.724671 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:50.22465854 +0000 UTC m=+43.009158411 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.771579 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dhf59" event={"ID":"080291d9-3dec-4cc7-aeff-f9dbdb7abb68","Type":"ContainerStarted","Data":"83263358a4290566f0e16960039282719da73519799730717969e289bf7c699b"} Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.795357 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" event={"ID":"4bab7374-a3b1-48d3-97f3-1b3e63392ff3","Type":"ContainerStarted","Data":"5326f878954214c6a1cddd36ab57d613b784b8b766c38188ae4e393ae275170e"} Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.825665 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:49 crc kubenswrapper[4961]: E0120 11:04:49.826847 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:50.32683282 +0000 UTC m=+43.111332691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.834250 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fl5hs" event={"ID":"aeb7e86c-90a7-48d9-a641-0814203fce0d","Type":"ContainerStarted","Data":"a97b48faa6de07ac92865b3378f54bccdb9c9b04625f68e2ce379d162dd94088"} Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.884752 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6lk7p" event={"ID":"6088e2cb-5b23-44f4-87a5-af1d5f36bca3","Type":"ContainerStarted","Data":"8727c20245ed78209fc866ba9dc1041974c6cfafac96d3590ce48d1e47ba2203"} Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.928184 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:49 crc kubenswrapper[4961]: E0120 11:04:49.930088 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:50.430069904 +0000 UTC m=+43.214569775 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.939546 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5rmt2" podStartSLOduration=18.939532028 podStartE2EDuration="18.939532028s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:49.937571982 +0000 UTC m=+42.722071853" watchObservedRunningTime="2026-01-20 11:04:49.939532028 +0000 UTC m=+42.724031899" Jan 20 11:04:49 crc kubenswrapper[4961]: I0120 11:04:49.965206 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fl5hs" podStartSLOduration=18.965179595 podStartE2EDuration="18.965179595s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:49.963025124 +0000 UTC m=+42.747525005" watchObservedRunningTime="2026-01-20 11:04:49.965179595 +0000 UTC m=+42.749679466" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.019713 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" event={"ID":"221c46d0-ccdb-4e6a-a143-04c3bce55711","Type":"ContainerStarted","Data":"299c5de65e96cad27fe4afa404259182ea95159dc51fe9410a74117eb96ac2d0"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.030505 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:50 crc kubenswrapper[4961]: E0120 11:04:50.030830 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:50.530809479 +0000 UTC m=+43.315309340 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.056427 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-t464w" podStartSLOduration=19.056406136 podStartE2EDuration="19.056406136s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:50.054571562 +0000 UTC m=+42.839071433" watchObservedRunningTime="2026-01-20 11:04:50.056406136 +0000 UTC m=+42.840906007" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.071581 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dhf59" podStartSLOduration=19.071542124 podStartE2EDuration="19.071542124s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:50.020406763 +0000 UTC m=+42.804906654" watchObservedRunningTime="2026-01-20 11:04:50.071542124 +0000 UTC m=+42.856041995" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.102180 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-dghc2" podStartSLOduration=7.102145369 podStartE2EDuration="7.102145369s" podCreationTimestamp="2026-01-20 11:04:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:50.090026862 +0000 UTC m=+42.874526733" watchObservedRunningTime="2026-01-20 11:04:50.102145369 +0000 UTC m=+42.886645240" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.108496 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" event={"ID":"088f8776-75dd-4adc-9647-6a03e411313e","Type":"ContainerStarted","Data":"89b03f1eed369f180ab148d4de3a12340ebd4ebab477a9782c9b9b951b4939c9"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.125531 4961 patch_prober.go:28] interesting pod/router-default-5444994796-6xx2v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 11:04:50 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Jan 20 11:04:50 crc kubenswrapper[4961]: [+]process-running ok Jan 20 11:04:50 crc kubenswrapper[4961]: healthz check failed Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.125645 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xx2v" podUID="d5ac0979-0fc9-48a6-8d22-6ba2c646287a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.137506 4961 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:50 crc kubenswrapper[4961]: E0120 11:04:50.140231 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:50.64020116 +0000 UTC m=+43.424701031 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.209300 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" event={"ID":"154d567d-eccf-4771-a5e0-60b2375d3e8b","Type":"ContainerStarted","Data":"e98f449e1c365db0eea78b8f0c02a899e25a16b9e0564d87672fb919c18c4f21"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.238654 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:50 crc kubenswrapper[4961]: E0120 11:04:50.239733 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:50.739688545 +0000 UTC m=+43.524188406 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.256024 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cpvtl" event={"ID":"f4c616c0-5852-4a0c-98e7-7d6af398ed2e","Type":"ContainerStarted","Data":"b5e67850075fe7f4d5b952d19216a7430dc98457167a9d8357ac2e1da511f689"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.342666 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:50 crc kubenswrapper[4961]: E0120 11:04:50.343195 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:50.843176786 +0000 UTC m=+43.627676657 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.376693 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" event={"ID":"aa2b6ee6-f793-43c1-bde1-92e1d8e67754","Type":"ContainerStarted","Data":"01a9caadc3d24daf8f2cfcd77d3b09fc94fe7660acedfacfb166c15f0039e37d"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.378316 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-20 10:59:49 +0000 UTC, rotation deadline is 2026-11-24 03:42:34.128817835 +0000 UTC Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.378344 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7384h37m43.750475777s for next certificate rotation Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.402731 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx" event={"ID":"b736ffe7-8ba8-4d20-8831-37d44f8d63de","Type":"ContainerStarted","Data":"23c06fa0624571ada2479e7a0562a9ddb3d0d13cf0a2526e05a1ad08194821de"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.404207 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.421174 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-fw9fs" podStartSLOduration=19.421156292 podStartE2EDuration="19.421156292s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:50.29224629 +0000 UTC m=+43.076746161" watchObservedRunningTime="2026-01-20 11:04:50.421156292 +0000 UTC m=+43.205656163" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.423685 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" event={"ID":"82cc64f1-0377-43e9-94a0-213d82b4a415","Type":"ContainerStarted","Data":"9ab61abb23c6e58eafa0f37b907333cb677d1d66aa2e352ae9ad35bad49af9bf"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.424802 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.434499 4961 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-5bfjx container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.434623 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx" podUID="b736ffe7-8ba8-4d20-8831-37d44f8d63de" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.444131 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:50 crc kubenswrapper[4961]: E0120 11:04:50.445346 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:50.945329254 +0000 UTC m=+43.729829125 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.458622 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tk9p" event={"ID":"aaca7742-2ff5-4b80-9ea2-ed94aa869684","Type":"ContainerStarted","Data":"103b957d39444f56f4681dd87e75f7a5ce8a08a1640dbfe48efb0cfe05e91e59"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.468578 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xkbs6" podStartSLOduration=19.468559654 podStartE2EDuration="19.468559654s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:50.425022014 +0000 UTC m=+43.209521885" watchObservedRunningTime="2026-01-20 11:04:50.468559654 +0000 UTC m=+43.253059525" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.468843 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.469723 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" podStartSLOduration=19.469715192 podStartE2EDuration="19.469715192s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:50.469243061 +0000 UTC m=+43.253742932" watchObservedRunningTime="2026-01-20 11:04:50.469715192 +0000 UTC m=+43.254215063" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.509721 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-x6vpd" event={"ID":"c819a2b6-2905-4966-aad3-55cf95ee88ef","Type":"ContainerStarted","Data":"1db622b43174f414aa85486100ca1bb3e468fb4053b7f23d8eff259342a65115"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.516558 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-44twq" event={"ID":"4e3fb6d8-549c-408c-b3bb-e8080a7b45f3","Type":"ContainerStarted","Data":"69f39ea0b6542cbce74e599b71316517ef11d2fcd0ee79ba435c784ad3593902"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.519157 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-cczpb"] Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.542817 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx" podStartSLOduration=19.542790012 podStartE2EDuration="19.542790012s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:50.518124528 +0000 UTC m=+43.302624399" watchObservedRunningTime="2026-01-20 
11:04:50.542790012 +0000 UTC m=+43.327289893" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.546013 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:50 crc kubenswrapper[4961]: E0120 11:04:50.555616 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:51.055587615 +0000 UTC m=+43.840087486 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.574812 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kgt2j" event={"ID":"631d0725-bacc-431d-82ce-6db496387d50","Type":"ContainerStarted","Data":"b42433f5c8757bccf43c052b9575f22490386bc95cd2bad9affd0976d271c686"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.574966 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tk9p" podStartSLOduration=19.574953884 podStartE2EDuration="19.574953884s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:50.572789902 +0000 UTC m=+43.357289773" watchObservedRunningTime="2026-01-20 11:04:50.574953884 +0000 UTC m=+43.359453755" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.576431 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz" event={"ID":"79d8764d-52dd-4350-9bef-e079ce9ade6a","Type":"ContainerStarted","Data":"fac32ac78605a3c430432f124a354979f5f9c001d03360f0c1dccd2644236e4d"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.577463 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.583563 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" event={"ID":"0359d20b-0121-43aa-8b56-04fc6210db6e","Type":"ContainerStarted","Data":"da2b9a74ffd0e4e3ad156c1e6bbe5bc2a81f5cd08c4f2d8768830ae3183fbede"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.583690 4961 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-jmdhz container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" start-of-body= Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 
11:04:50.583758 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz" podUID="79d8764d-52dd-4350-9bef-e079ce9ade6a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.598275 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" event={"ID":"51f20be1-6fa3-47ce-ac42-6d9a618ae151","Type":"ContainerStarted","Data":"a1c3d4775e9e4a9e3dc281e4eba5ee818336421da5b88f986e4c066eb4998192"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.603963 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" event={"ID":"28d61801-55c0-4e08-99b9-0a6b3d16fe71","Type":"ContainerStarted","Data":"bb24b38180dc71cb6ab8c6ba22b883cab26fd8509cca868fcad45a75b2dbd1ce"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.607663 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-x6vpd" podStartSLOduration=7.607647918 podStartE2EDuration="7.607647918s" podCreationTimestamp="2026-01-20 11:04:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:50.606333277 +0000 UTC m=+43.390833148" watchObservedRunningTime="2026-01-20 11:04:50.607647918 +0000 UTC m=+43.392147789" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.612633 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.626464 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" event={"ID":"74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a","Type":"ContainerStarted","Data":"0ab68fdf578580dcd1f694c59ad43bce6eec061bf31e94272495846782071115"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.626523 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" event={"ID":"74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a","Type":"ContainerStarted","Data":"135c62d8c7b9296980a1a16cfea963fc16ed706b0279621e3a419b44ab62ac9d"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.648274 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6" event={"ID":"60f516db-f145-4aeb-8bdf-d7e4445af01b","Type":"ContainerStarted","Data":"e3d2d12f99acec28761379341b420900c9a9f8193b7052af345609e6a68a7b36"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.648341 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6" event={"ID":"60f516db-f145-4aeb-8bdf-d7e4445af01b","Type":"ContainerStarted","Data":"03c1ffba808848b313105af8db0952a6c43d514146c0d59dc48cc9d226d8724c"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.648534 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:50 crc kubenswrapper[4961]: E0120 11:04:50.650323 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:51.150298418 +0000 UTC m=+43.934798289 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.653763 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-l62cr" event={"ID":"68816dca-2483-4789-9db6-614582f6c45a","Type":"ContainerStarted","Data":"93050c9432493e6419dccad509ea1dc2714408ef8227c6b882d0cbb1f0031866"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.672339 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zm7fw" event={"ID":"0702836b-cde4-4ae0-9f51-b855a370c0f5","Type":"ContainerStarted","Data":"e4e8846b5b54e89dbe4ff37c789651fecd2da4cdbbacb28fe961450ad64cb2b4"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.689864 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kdxfq" event={"ID":"a74ca54e-cdb6-4759-9bf3-fc4a3defb11c","Type":"ContainerStarted","Data":"2d9d0f85d253e081a86ffa0f04602efbf63bfdf57480a378ed653115e7c70509"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.723259 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" event={"ID":"9bc6ab6c-6aaa-4c35-bbbe-24448f57cbf9","Type":"ContainerStarted","Data":"87bc4d104d8c761d50f7de68e0c1ca6377d97dafc2bb8ff44dddb19dd7fc7cca"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.724235 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.750346 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.762339 4961 generic.go:334] "Generic (PLEG): container finished" podID="a81eef7a-7f29-4f79-863e-c3e009b56ad8" containerID="348bf5a1f6c70ea98e85440029e68f5bb0cdbe2a888652bd6c2a723ed676f908" exitCode=0 Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.762476 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7r52t" 
event={"ID":"a81eef7a-7f29-4f79-863e-c3e009b56ad8","Type":"ContainerDied","Data":"348bf5a1f6c70ea98e85440029e68f5bb0cdbe2a888652bd6c2a723ed676f908"} Jan 20 11:04:50 crc kubenswrapper[4961]: E0120 11:04:50.781108 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:51.260135318 +0000 UTC m=+44.044635189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.788685 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz" podStartSLOduration=19.788662334 podStartE2EDuration="19.788662334s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:50.772948832 +0000 UTC m=+43.557448703" watchObservedRunningTime="2026-01-20 11:04:50.788662334 +0000 UTC m=+43.573162205" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.806498 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" event={"ID":"9f0a10d3-171e-4695-b66e-870bb63e5712","Type":"ContainerStarted","Data":"232c271acf804e3e0dbd9a5f88ceca7ac318d3944a36c7d6350510f967ae9560"} Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.808977 4961 patch_prober.go:28] interesting pod/downloads-7954f5f757-7szjl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.809020 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7szjl" podUID="cf90e4e1-8fba-4df9-8e30-b392921d4d16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.854654 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.857862 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-tz72b" Jan 20 11:04:50 crc kubenswrapper[4961]: E0120 11:04:50.861241 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-20 11:04:51.361207532 +0000 UTC m=+44.145707403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.861570 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:50 crc kubenswrapper[4961]: E0120 11:04:50.867684 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:51.367668635 +0000 UTC m=+44.152168506 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.877509 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" podStartSLOduration=19.877493677 podStartE2EDuration="19.877493677s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:50.876186477 +0000 UTC m=+43.660686348" watchObservedRunningTime="2026-01-20 11:04:50.877493677 +0000 UTC m=+43.661993548" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.877915 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-jq4wg" podStartSLOduration=19.877909897 podStartE2EDuration="19.877909897s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:50.832999984 +0000 UTC m=+43.617499885" watchObservedRunningTime="2026-01-20 11:04:50.877909897 +0000 UTC m=+43.662409768" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.883612 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cwsq" Jan 20 11:04:50 crc kubenswrapper[4961]: I0120 11:04:50.963473 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:50 crc kubenswrapper[4961]: E0120 11:04:50.964396 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:51.464375335 +0000 UTC m=+44.248875206 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.020764 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" podStartSLOduration=20.020743269 podStartE2EDuration="20.020743269s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:50.953516748 +0000 UTC m=+43.738016619" watchObservedRunningTime="2026-01-20 11:04:51.020743269 +0000 UTC m=+43.805243140" Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.066772 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-l62cr" podStartSLOduration=20.066755079 podStartE2EDuration="20.066755079s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:51.063295407 +0000 UTC m=+43.847795278" watchObservedRunningTime="2026-01-20 11:04:51.066755079 +0000 UTC m=+43.851254950" Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.067909 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kdxfq" podStartSLOduration=20.067900736 podStartE2EDuration="20.067900736s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:51.022859419 +0000 UTC m=+43.807359290" watchObservedRunningTime="2026-01-20 11:04:51.067900736 +0000 UTC m=+43.852400607" Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.070845 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:51 crc kubenswrapper[4961]: E0120 11:04:51.071179 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:51.571166843 +0000 UTC m=+44.355666714 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.103197 4961 patch_prober.go:28] interesting pod/router-default-5444994796-6xx2v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 11:04:51 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Jan 20 11:04:51 crc kubenswrapper[4961]: [+]process-running ok Jan 20 11:04:51 crc kubenswrapper[4961]: healthz check failed Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.103566 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xx2v" podUID="d5ac0979-0fc9-48a6-8d22-6ba2c646287a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.174748 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:51 crc kubenswrapper[4961]: E0120 11:04:51.175634 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:51.675616046 +0000 UTC m=+44.460115917 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.281319 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:51 crc kubenswrapper[4961]: E0120 11:04:51.281816 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:51.781800581 +0000 UTC m=+44.566300452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.383952 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:51 crc kubenswrapper[4961]: E0120 11:04:51.384380 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:51.884359449 +0000 UTC m=+44.668859320 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.488203 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:51 crc kubenswrapper[4961]: E0120 11:04:51.489077 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:51.989042698 +0000 UTC m=+44.773542569 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.592015 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:51 crc kubenswrapper[4961]: E0120 11:04:51.593262 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:52.093233475 +0000 UTC m=+44.877733346 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.594241 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:51 crc kubenswrapper[4961]: E0120 11:04:51.594787 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:52.094772671 +0000 UTC m=+44.879272542 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.698312 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:51 crc kubenswrapper[4961]: E0120 11:04:51.699136 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:52.199115501 +0000 UTC m=+44.983615372 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.800008 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:51 crc kubenswrapper[4961]: E0120 11:04:51.800782 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:52.300755248 +0000 UTC m=+45.085255119 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.857340 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-x6vpd" event={"ID":"c819a2b6-2905-4966-aad3-55cf95ee88ef","Type":"ContainerStarted","Data":"0239898440f45ab9c872186d55cbfc9db9c2d76a44c5b8565a79b7b2b5aa36a5"} Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.910217 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:51 crc kubenswrapper[4961]: E0120 11:04:51.911473 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:52.411455319 +0000 UTC m=+45.195955190 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.921480 4961 generic.go:334] "Generic (PLEG): container finished" podID="ff8f8f6e-bb01-49c6-864a-9a98a57abea8" containerID="60b4ec1b88d62ce3dd54d4c626bb9f11e461cb14a5107a9d7133c6efcf9527da" exitCode=0 Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.921555 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" event={"ID":"ff8f8f6e-bb01-49c6-864a-9a98a57abea8","Type":"ContainerDied","Data":"60b4ec1b88d62ce3dd54d4c626bb9f11e461cb14a5107a9d7133c6efcf9527da"} Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.948361 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6" event={"ID":"60f516db-f145-4aeb-8bdf-d7e4445af01b","Type":"ContainerStarted","Data":"b3a75af2741c8a8dcee1eb4e2e371eaf3e45929259ec498c2831b0ff1a9fd6e0"} Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.970243 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz" event={"ID":"79d8764d-52dd-4350-9bef-e079ce9ade6a","Type":"ContainerStarted","Data":"446f97a68ee205bc01c0cb0d443193dbf91bf1e4b0c133425fa59ef9afb3809e"} Jan 20 11:04:51 crc kubenswrapper[4961]: I0120 11:04:51.986643 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jmdhz" Jan 20 11:04:52 crc 
kubenswrapper[4961]: I0120 11:04:52.014140 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zm7fw" event={"ID":"0702836b-cde4-4ae0-9f51-b855a370c0f5","Type":"ContainerStarted","Data":"827c27329aeb54cdbfc46a7a0a914b8f3d848a37fe3f58857db41a02bea7f339"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.014226 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zm7fw" event={"ID":"0702836b-cde4-4ae0-9f51-b855a370c0f5","Type":"ContainerStarted","Data":"7554c8fd1904fdb8fdbbcec3a4e9119b1215b91e673f05a1df3d598b42ba6b2f"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.016271 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zm7fw" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.017251 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:52 crc kubenswrapper[4961]: E0120 11:04:52.018554 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:52.518538074 +0000 UTC m=+45.303037945 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.067946 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kgt2j" event={"ID":"631d0725-bacc-431d-82ce-6db496387d50","Type":"ContainerStarted","Data":"50993b95d7e93fb2c123553237cc313cea2b6eb15b209d5be3c8f757ea2df79d"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.067998 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kgt2j" event={"ID":"631d0725-bacc-431d-82ce-6db496387d50","Type":"ContainerStarted","Data":"be32e33775258def929f4af2668ac59a063b86ac246458285eca7d782f993906"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.113253 4961 patch_prober.go:28] interesting pod/router-default-5444994796-6xx2v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 11:04:52 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Jan 20 11:04:52 crc kubenswrapper[4961]: [+]process-running ok Jan 20 11:04:52 crc kubenswrapper[4961]: healthz check failed Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.113573 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xx2v" 
podUID="d5ac0979-0fc9-48a6-8d22-6ba2c646287a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.120295 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kdxfq" event={"ID":"a74ca54e-cdb6-4759-9bf3-fc4a3defb11c","Type":"ContainerStarted","Data":"7f71e34b708f461facfbd153b7baf2ffdf90831a2b8ae1344221e6241ba40bb7"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.122108 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:52 crc kubenswrapper[4961]: E0120 11:04:52.122801 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:52.622786053 +0000 UTC m=+45.407285924 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.174050 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7r52t" event={"ID":"a81eef7a-7f29-4f79-863e-c3e009b56ad8","Type":"ContainerStarted","Data":"df091c6a457a217d9e567fcf89bdb567359d37ad3e51217ef8f2dd957415e087"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.212801 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2vqvf" event={"ID":"348fca40-a376-4560-a301-81c5d7dc93dd","Type":"ContainerStarted","Data":"4f0c038eef7da48106c8382fdc4a83760ba7a430962c12747eb8c504132003a3"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.212862 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2vqvf" event={"ID":"348fca40-a376-4560-a301-81c5d7dc93dd","Type":"ContainerStarted","Data":"509d48dbf1c6079678765286f996ff03e31806a8008085dc2c55824f14860f55"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.224963 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:52 crc kubenswrapper[4961]: E0120 11:04:52.225508 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-20 11:04:52.725489465 +0000 UTC m=+45.509989336 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.254985 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cpvtl" event={"ID":"f4c616c0-5852-4a0c-98e7-7d6af398ed2e","Type":"ContainerStarted","Data":"2cc8a23d0b5e37083e2189ac9bf591bb3f6e8a3c02f13d8e7e6172d01becacad"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.268650 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2tk9p" event={"ID":"aaca7742-2ff5-4b80-9ea2-ed94aa869684","Type":"ContainerStarted","Data":"caf24d6cd8d07352ec305631c2465024c5ac4bb119f0600d1a5b91022361671e"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.271972 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6lk7p" event={"ID":"6088e2cb-5b23-44f4-87a5-af1d5f36bca3","Type":"ContainerStarted","Data":"25d8a0bbcfa860bfb53eed5b2fc98e5c21e9d80c3388fe1f309f13ccca3ca902"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.286814 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx" event={"ID":"b736ffe7-8ba8-4d20-8831-37d44f8d63de","Type":"ContainerStarted","Data":"ecc23dd350b7347470860d8c5683dda800a61d314d25c37712e47ccb785b63fb"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.300585 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bwhk5" event={"ID":"f2e00fdd-4780-4faa-a3c3-75b59218a5a2","Type":"ContainerStarted","Data":"d6916004742e3df16fa01cf839938a7b371a6fdbcd21343bd2ce7af0a1ba488b"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.321645 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-v7tsh" event={"ID":"9f13654a-0829-467a-8faa-8cbba4049aca","Type":"ContainerStarted","Data":"869428b859e88e94c573a1b94f5f51284d10418969452632ee3308c54184d3a6"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.326739 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:52 crc kubenswrapper[4961]: E0120 11:04:52.327009 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:52.826988168 +0000 UTC m=+45.611488039 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.327358 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:52 crc kubenswrapper[4961]: E0120 11:04:52.329642 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:52.82962054 +0000 UTC m=+45.614120411 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.348116 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" event={"ID":"221c46d0-ccdb-4e6a-a143-04c3bce55711","Type":"ContainerStarted","Data":"c34cc118527596dbc2be1b8a8fe04864e553e30873c8f0c0bed91b3b29ef4953"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.348507 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.352572 4961 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bslr7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.352615 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" podUID="221c46d0-ccdb-4e6a-a143-04c3bce55711" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.374643 4961 generic.go:334] "Generic (PLEG): container finished" podID="74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a" containerID="0ab68fdf578580dcd1f694c59ad43bce6eec061bf31e94272495846782071115" exitCode=0 Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.374784 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" 
event={"ID":"74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a","Type":"ContainerDied","Data":"0ab68fdf578580dcd1f694c59ad43bce6eec061bf31e94272495846782071115"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.374819 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" event={"ID":"74b0b7f5-b1ac-4b0c-bb46-0b6ef980149a","Type":"ContainerStarted","Data":"5c42351c6eda6557f5f0c596bd30fd3bd1acc2d7df9b88ce9b339587037420df"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.375721 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.390555 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vbbd6" podStartSLOduration=21.390532512 podStartE2EDuration="21.390532512s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:52.388553496 +0000 UTC m=+45.173053367" watchObservedRunningTime="2026-01-20 11:04:52.390532512 +0000 UTC m=+45.175032383" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.399634 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" event={"ID":"088f8776-75dd-4adc-9647-6a03e411313e","Type":"ContainerStarted","Data":"2297a7d13020f734584e9597fbe4492d828c169fb40aef182897ea93ab8c49f4"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.421712 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" event={"ID":"4bab7374-a3b1-48d3-97f3-1b3e63392ff3","Type":"ContainerStarted","Data":"62cbdcde55dec3305d0713b06a9d089154f8960f5d5a3206e3ede0128cf5b123"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.431083 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:52 crc kubenswrapper[4961]: E0120 11:04:52.431216 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:52.931184305 +0000 UTC m=+45.715684176 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.431508 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:52 crc kubenswrapper[4961]: E0120 11:04:52.431996 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:52.931973964 +0000 UTC m=+45.716473835 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.440516 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kgt2j" podStartSLOduration=21.440492995 podStartE2EDuration="21.440492995s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:52.43943669 +0000 UTC m=+45.223936561" watchObservedRunningTime="2026-01-20 11:04:52.440492995 +0000 UTC m=+45.224992866" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.442211 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9xqq" event={"ID":"4e828266-647a-4cc3-9e4e-4d27d1ddbda1","Type":"ContainerStarted","Data":"f80fea6544e045c7c5fd083202cbd634dbc733128b0470ec1a30d58784f93a20"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.458760 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-44twq" event={"ID":"4e3fb6d8-549c-408c-b3bb-e8080a7b45f3","Type":"ContainerStarted","Data":"89f1e0206f3b66066710ab9bb668849456c0e76c82d4ca0d82b71a0c8844504a"} Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.458823 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-44twq" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.460870 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" podUID="68d3d73a-7ef8-49ee-ae94-2d73115e126e" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca" 
gracePeriod=30 Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.464951 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5bfjx" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.520349 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zm7fw" podStartSLOduration=21.520320126 podStartE2EDuration="21.520320126s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:52.489561837 +0000 UTC m=+45.274061708" watchObservedRunningTime="2026-01-20 11:04:52.520320126 +0000 UTC m=+45.304819997" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.521256 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cpvtl" podStartSLOduration=21.521248108 podStartE2EDuration="21.521248108s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:52.520956881 +0000 UTC m=+45.305456752" watchObservedRunningTime="2026-01-20 11:04:52.521248108 +0000 UTC m=+45.305747979" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.535629 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:52 crc kubenswrapper[4961]: E0120 11:04:52.535788 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:53.035765201 +0000 UTC m=+45.820265072 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.536219 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:52 crc kubenswrapper[4961]: E0120 11:04:52.541382 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:53.041341593 +0000 UTC m=+45.825841454 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.597706 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" podStartSLOduration=21.597684767 podStartE2EDuration="21.597684767s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:52.560676741 +0000 UTC m=+45.345176602" watchObservedRunningTime="2026-01-20 11:04:52.597684767 +0000 UTC m=+45.382184638" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.638030 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:52 crc kubenswrapper[4961]: E0120 11:04:52.638417 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:53.138400492 +0000 UTC m=+45.922900353 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.687669 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bwhk5" podStartSLOduration=21.687611057 podStartE2EDuration="21.687611057s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:52.683977931 +0000 UTC m=+45.468477792" watchObservedRunningTime="2026-01-20 11:04:52.687611057 +0000 UTC m=+45.472110928" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.687958 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b58ws" podStartSLOduration=21.687951825 podStartE2EDuration="21.687951825s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:52.599545361 +0000 UTC m=+45.384045222" watchObservedRunningTime="2026-01-20 11:04:52.687951825 +0000 UTC m=+45.472451706" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.731912 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-v7tsh" podStartSLOduration=21.731885045 podStartE2EDuration="21.731885045s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:52.731020655 +0000 UTC m=+45.515520526" watchObservedRunningTime="2026-01-20 11:04:52.731885045 +0000 UTC m=+45.516384916" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.740013 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:52 crc kubenswrapper[4961]: E0120 11:04:52.740509 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:53.240494109 +0000 UTC m=+46.024993970 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.779194 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" podStartSLOduration=21.779175365 podStartE2EDuration="21.779175365s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:52.775720343 +0000 UTC m=+45.560220214" watchObservedRunningTime="2026-01-20 11:04:52.779175365 +0000 UTC m=+45.563675236" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.840826 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:52 crc kubenswrapper[4961]: E0120 11:04:52.841239 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:53.341223164 +0000 UTC m=+46.125723025 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.878945 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" podStartSLOduration=21.878923016999998 podStartE2EDuration="21.878923017s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:52.877990985 +0000 UTC m=+45.662490866" watchObservedRunningTime="2026-01-20 11:04:52.878923017 +0000 UTC m=+45.663422878" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.879131 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-44twq" podStartSLOduration=9.879125811 podStartE2EDuration="9.879125811s" podCreationTimestamp="2026-01-20 11:04:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:52.828975234 +0000 UTC m=+45.613475105" watchObservedRunningTime="2026-01-20 11:04:52.879125811 +0000 UTC m=+45.663625682" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.932966 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-2vqvf" podStartSLOduration=21.932941826 podStartE2EDuration="21.932941826s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:52.929704989 +0000 UTC m=+45.714204860" watchObservedRunningTime="2026-01-20 11:04:52.932941826 +0000 UTC m=+45.717441697" Jan 20 11:04:52 crc kubenswrapper[4961]: I0120 11:04:52.942715 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:52 crc kubenswrapper[4961]: E0120 11:04:52.943118 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:53.443105126 +0000 UTC m=+46.227604997 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.046321 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:53 crc kubenswrapper[4961]: E0120 11:04:53.046747 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:53.54673229 +0000 UTC m=+46.331232161 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.108212 4961 patch_prober.go:28] interesting pod/router-default-5444994796-6xx2v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 11:04:53 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Jan 20 11:04:53 crc kubenswrapper[4961]: [+]process-running ok Jan 20 11:04:53 crc kubenswrapper[4961]: healthz check failed Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.108270 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xx2v" podUID="d5ac0979-0fc9-48a6-8d22-6ba2c646287a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.109493 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6lk7p" podStartSLOduration=22.109480556 podStartE2EDuration="22.109480556s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:53.031279274 +0000 UTC m=+45.815779145" watchObservedRunningTime="2026-01-20 11:04:53.109480556 +0000 UTC m=+45.893980427" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.110710 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m9xqq" podStartSLOduration=22.110700415 podStartE2EDuration="22.110700415s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:53.104468087 +0000 UTC m=+45.888967958" watchObservedRunningTime="2026-01-20 11:04:53.110700415 +0000 UTC m=+45.895200296" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.147848 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:53 crc kubenswrapper[4961]: E0120 11:04:53.148204 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:53.648189872 +0000 UTC m=+46.432689743 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.248956 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:53 crc kubenswrapper[4961]: E0120 11:04:53.249354 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:53.749338117 +0000 UTC m=+46.533837988 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.351261 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:53 crc kubenswrapper[4961]: E0120 11:04:53.351684 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:53.85166472 +0000 UTC m=+46.636164591 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.452728 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:53 crc kubenswrapper[4961]: E0120 11:04:53.453457 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:53.95344206 +0000 UTC m=+46.737941931 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.463768 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cpvtl" event={"ID":"f4c616c0-5852-4a0c-98e7-7d6af398ed2e","Type":"ContainerStarted","Data":"300077e93183a8ab6e3bd59b20e5e726de8aae983b9038e8ffeb817346500597"} Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.466045 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-44twq" event={"ID":"4e3fb6d8-549c-408c-b3bb-e8080a7b45f3","Type":"ContainerStarted","Data":"6b2540cfc51e0d833134813cf6ba0e8a63f6047750344bfb2034da8a4308cd39"} Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.468153 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7r52t" event={"ID":"a81eef7a-7f29-4f79-863e-c3e009b56ad8","Type":"ContainerStarted","Data":"49186718a1e473b2e82d9079d6ea871aa6eba9f439751a108525d14406ae337e"} Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.469707 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ngbt8" event={"ID":"4bab7374-a3b1-48d3-97f3-1b3e63392ff3","Type":"ContainerStarted","Data":"2ea20452829ae985bc31ba05f3c59e90bc5c4b7a90e5f21e5bf76de1642897eb"} Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.472093 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" event={"ID":"0359d20b-0121-43aa-8b56-04fc6210db6e","Type":"ContainerStarted","Data":"836a65f6f31a37039295c9015b3bd6b974840458b9975b27624ddf6622e34687"} Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.473329 4961 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bslr7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe 
status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.473363 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" podUID="221c46d0-ccdb-4e6a-a143-04c3bce55711" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.511760 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rcphf"] Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.512771 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rcphf" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.522363 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.552539 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rcphf"] Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.556126 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.556384 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3449c15e-8212-40ed-85f5-37a0f79fd9e4-utilities\") pod \"certified-operators-rcphf\" (UID: \"3449c15e-8212-40ed-85f5-37a0f79fd9e4\") " pod="openshift-marketplace/certified-operators-rcphf" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.556728 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3449c15e-8212-40ed-85f5-37a0f79fd9e4-catalog-content\") pod \"certified-operators-rcphf\" (UID: \"3449c15e-8212-40ed-85f5-37a0f79fd9e4\") " pod="openshift-marketplace/certified-operators-rcphf" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.556760 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dph7\" (UniqueName: \"kubernetes.io/projected/3449c15e-8212-40ed-85f5-37a0f79fd9e4-kube-api-access-2dph7\") pod \"certified-operators-rcphf\" (UID: \"3449c15e-8212-40ed-85f5-37a0f79fd9e4\") " pod="openshift-marketplace/certified-operators-rcphf" Jan 20 11:04:53 crc kubenswrapper[4961]: E0120 11:04:53.590463 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:54.090436994 +0000 UTC m=+46.874936865 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.657030 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-7r52t" podStartSLOduration=22.65700915 podStartE2EDuration="22.65700915s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:53.604000925 +0000 UTC m=+46.388500796" watchObservedRunningTime="2026-01-20 11:04:53.65700915 +0000 UTC m=+46.441509021" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.659756 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.659996 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dph7\" (UniqueName: \"kubernetes.io/projected/3449c15e-8212-40ed-85f5-37a0f79fd9e4-kube-api-access-2dph7\") pod \"certified-operators-rcphf\" (UID: \"3449c15e-8212-40ed-85f5-37a0f79fd9e4\") " pod="openshift-marketplace/certified-operators-rcphf" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.660108 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3449c15e-8212-40ed-85f5-37a0f79fd9e4-utilities\") pod \"certified-operators-rcphf\" (UID: \"3449c15e-8212-40ed-85f5-37a0f79fd9e4\") " pod="openshift-marketplace/certified-operators-rcphf" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.660165 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3449c15e-8212-40ed-85f5-37a0f79fd9e4-catalog-content\") pod \"certified-operators-rcphf\" (UID: \"3449c15e-8212-40ed-85f5-37a0f79fd9e4\") " pod="openshift-marketplace/certified-operators-rcphf" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.660569 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3449c15e-8212-40ed-85f5-37a0f79fd9e4-catalog-content\") pod \"certified-operators-rcphf\" (UID: \"3449c15e-8212-40ed-85f5-37a0f79fd9e4\") " pod="openshift-marketplace/certified-operators-rcphf" Jan 20 11:04:53 crc kubenswrapper[4961]: E0120 11:04:53.660638 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:54.160621516 +0000 UTC m=+46.945121387 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.661048 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3449c15e-8212-40ed-85f5-37a0f79fd9e4-utilities\") pod \"certified-operators-rcphf\" (UID: \"3449c15e-8212-40ed-85f5-37a0f79fd9e4\") " pod="openshift-marketplace/certified-operators-rcphf" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.695534 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7p747"] Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.696726 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7p747" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.709046 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.762308 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dph7\" (UniqueName: \"kubernetes.io/projected/3449c15e-8212-40ed-85f5-37a0f79fd9e4-kube-api-access-2dph7\") pod \"certified-operators-rcphf\" (UID: \"3449c15e-8212-40ed-85f5-37a0f79fd9e4\") " pod="openshift-marketplace/certified-operators-rcphf" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.763255 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4155767c-ce93-427a-9a44-d02d9fa3ac62-catalog-content\") pod \"community-operators-7p747\" (UID: \"4155767c-ce93-427a-9a44-d02d9fa3ac62\") " pod="openshift-marketplace/community-operators-7p747" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.763319 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqlsp\" (UniqueName: \"kubernetes.io/projected/4155767c-ce93-427a-9a44-d02d9fa3ac62-kube-api-access-hqlsp\") pod \"community-operators-7p747\" (UID: \"4155767c-ce93-427a-9a44-d02d9fa3ac62\") " pod="openshift-marketplace/community-operators-7p747" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.763342 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4155767c-ce93-427a-9a44-d02d9fa3ac62-utilities\") pod \"community-operators-7p747\" (UID: \"4155767c-ce93-427a-9a44-d02d9fa3ac62\") " pod="openshift-marketplace/community-operators-7p747" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.763382 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:53 crc kubenswrapper[4961]: E0120 11:04:53.763774 4961 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:54.263756688 +0000 UTC m=+47.048256559 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.765445 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7p747"] Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.829433 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rcphf" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.866013 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.866292 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4155767c-ce93-427a-9a44-d02d9fa3ac62-catalog-content\") pod \"community-operators-7p747\" (UID: \"4155767c-ce93-427a-9a44-d02d9fa3ac62\") " pod="openshift-marketplace/community-operators-7p747" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.866358 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqlsp\" (UniqueName: \"kubernetes.io/projected/4155767c-ce93-427a-9a44-d02d9fa3ac62-kube-api-access-hqlsp\") pod \"community-operators-7p747\" (UID: \"4155767c-ce93-427a-9a44-d02d9fa3ac62\") " pod="openshift-marketplace/community-operators-7p747" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.866381 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4155767c-ce93-427a-9a44-d02d9fa3ac62-utilities\") pod \"community-operators-7p747\" (UID: \"4155767c-ce93-427a-9a44-d02d9fa3ac62\") " pod="openshift-marketplace/community-operators-7p747" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.866791 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4155767c-ce93-427a-9a44-d02d9fa3ac62-utilities\") pod \"community-operators-7p747\" (UID: \"4155767c-ce93-427a-9a44-d02d9fa3ac62\") " pod="openshift-marketplace/community-operators-7p747" Jan 20 11:04:53 crc kubenswrapper[4961]: E0120 11:04:53.866883 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:54.366865359 +0000 UTC m=+47.151365230 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.867254 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4155767c-ce93-427a-9a44-d02d9fa3ac62-catalog-content\") pod \"community-operators-7p747\" (UID: \"4155767c-ce93-427a-9a44-d02d9fa3ac62\") " pod="openshift-marketplace/community-operators-7p747" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.878352 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nl7rg"] Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.892366 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqlsp\" (UniqueName: \"kubernetes.io/projected/4155767c-ce93-427a-9a44-d02d9fa3ac62-kube-api-access-hqlsp\") pod \"community-operators-7p747\" (UID: \"4155767c-ce93-427a-9a44-d02d9fa3ac62\") " pod="openshift-marketplace/community-operators-7p747" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.894998 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nl7rg" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.909806 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nl7rg"] Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.967692 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.967764 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b27913bc-b262-4eb2-aeee-02a7365a3770-catalog-content\") pod \"certified-operators-nl7rg\" (UID: \"b27913bc-b262-4eb2-aeee-02a7365a3770\") " pod="openshift-marketplace/certified-operators-nl7rg" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.967793 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b27913bc-b262-4eb2-aeee-02a7365a3770-utilities\") pod \"certified-operators-nl7rg\" (UID: \"b27913bc-b262-4eb2-aeee-02a7365a3770\") " pod="openshift-marketplace/certified-operators-nl7rg" Jan 20 11:04:53 crc kubenswrapper[4961]: I0120 11:04:53.967869 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx8s7\" (UniqueName: \"kubernetes.io/projected/b27913bc-b262-4eb2-aeee-02a7365a3770-kube-api-access-hx8s7\") pod \"certified-operators-nl7rg\" (UID: \"b27913bc-b262-4eb2-aeee-02a7365a3770\") " pod="openshift-marketplace/certified-operators-nl7rg" Jan 20 11:04:53 crc kubenswrapper[4961]: E0120 11:04:53.968212 4961 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:54.468197149 +0000 UTC m=+47.252697020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.035315 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7p747" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.059332 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kkb2c"] Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.060554 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.069248 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.069422 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx8s7\" (UniqueName: \"kubernetes.io/projected/b27913bc-b262-4eb2-aeee-02a7365a3770-kube-api-access-hx8s7\") pod \"certified-operators-nl7rg\" (UID: \"b27913bc-b262-4eb2-aeee-02a7365a3770\") " pod="openshift-marketplace/certified-operators-nl7rg" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.069491 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b27913bc-b262-4eb2-aeee-02a7365a3770-catalog-content\") pod \"certified-operators-nl7rg\" (UID: \"b27913bc-b262-4eb2-aeee-02a7365a3770\") " pod="openshift-marketplace/certified-operators-nl7rg" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.069512 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b27913bc-b262-4eb2-aeee-02a7365a3770-utilities\") pod \"certified-operators-nl7rg\" (UID: \"b27913bc-b262-4eb2-aeee-02a7365a3770\") " pod="openshift-marketplace/certified-operators-nl7rg" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.069974 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b27913bc-b262-4eb2-aeee-02a7365a3770-utilities\") pod \"certified-operators-nl7rg\" (UID: \"b27913bc-b262-4eb2-aeee-02a7365a3770\") " pod="openshift-marketplace/certified-operators-nl7rg" Jan 20 11:04:54 crc kubenswrapper[4961]: E0120 11:04:54.070052 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:54.57003479 +0000 UTC m=+47.354534661 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.070139 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b27913bc-b262-4eb2-aeee-02a7365a3770-catalog-content\") pod \"certified-operators-nl7rg\" (UID: \"b27913bc-b262-4eb2-aeee-02a7365a3770\") " pod="openshift-marketplace/certified-operators-nl7rg" Jan 20 11:04:54 crc kubenswrapper[4961]: E0120 11:04:54.070372 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8f8f6e-bb01-49c6-864a-9a98a57abea8" containerName="collect-profiles" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.070392 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8f8f6e-bb01-49c6-864a-9a98a57abea8" containerName="collect-profiles" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.070475 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8f8f6e-bb01-49c6-864a-9a98a57abea8" containerName="collect-profiles" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.071011 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kkb2c" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.079461 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kkb2c"] Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.102969 4961 patch_prober.go:28] interesting pod/router-default-5444994796-6xx2v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 11:04:54 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Jan 20 11:04:54 crc kubenswrapper[4961]: [+]process-running ok Jan 20 11:04:54 crc kubenswrapper[4961]: healthz check failed Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.103521 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xx2v" podUID="d5ac0979-0fc9-48a6-8d22-6ba2c646287a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.122684 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx8s7\" (UniqueName: \"kubernetes.io/projected/b27913bc-b262-4eb2-aeee-02a7365a3770-kube-api-access-hx8s7\") pod \"certified-operators-nl7rg\" (UID: \"b27913bc-b262-4eb2-aeee-02a7365a3770\") " pod="openshift-marketplace/certified-operators-nl7rg" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.141603 4961 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.172848 4961 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-secret-volume\") pod \"ff8f8f6e-bb01-49c6-864a-9a98a57abea8\" (UID: \"ff8f8f6e-bb01-49c6-864a-9a98a57abea8\") " Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.172921 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l22gl\" (UniqueName: \"kubernetes.io/projected/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-kube-api-access-l22gl\") pod \"ff8f8f6e-bb01-49c6-864a-9a98a57abea8\" (UID: \"ff8f8f6e-bb01-49c6-864a-9a98a57abea8\") " Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.172949 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-config-volume\") pod \"ff8f8f6e-bb01-49c6-864a-9a98a57abea8\" (UID: \"ff8f8f6e-bb01-49c6-864a-9a98a57abea8\") " Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.173216 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-utilities\") pod \"community-operators-kkb2c\" (UID: \"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1\") " pod="openshift-marketplace/community-operators-kkb2c" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.173985 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwfpb\" (UniqueName: \"kubernetes.io/projected/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-kube-api-access-jwfpb\") pod \"community-operators-kkb2c\" (UID: \"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1\") " pod="openshift-marketplace/community-operators-kkb2c" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.174109 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.174186 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-catalog-content\") pod \"community-operators-kkb2c\" (UID: \"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1\") " pod="openshift-marketplace/community-operators-kkb2c" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.174624 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-config-volume" (OuterVolumeSpecName: "config-volume") pod "ff8f8f6e-bb01-49c6-864a-9a98a57abea8" (UID: "ff8f8f6e-bb01-49c6-864a-9a98a57abea8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:04:54 crc kubenswrapper[4961]: E0120 11:04:54.174960 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:54.674945664 +0000 UTC m=+47.459445535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.191901 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ff8f8f6e-bb01-49c6-864a-9a98a57abea8" (UID: "ff8f8f6e-bb01-49c6-864a-9a98a57abea8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.193358 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-kube-api-access-l22gl" (OuterVolumeSpecName: "kube-api-access-l22gl") pod "ff8f8f6e-bb01-49c6-864a-9a98a57abea8" (UID: "ff8f8f6e-bb01-49c6-864a-9a98a57abea8"). InnerVolumeSpecName "kube-api-access-l22gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.275329 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.275932 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-utilities\") pod \"community-operators-kkb2c\" (UID: \"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1\") " pod="openshift-marketplace/community-operators-kkb2c" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.275987 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwfpb\" (UniqueName: \"kubernetes.io/projected/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-kube-api-access-jwfpb\") pod \"community-operators-kkb2c\" (UID: \"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1\") " pod="openshift-marketplace/community-operators-kkb2c" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.276056 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-catalog-content\") pod \"community-operators-kkb2c\" (UID: \"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1\") " pod="openshift-marketplace/community-operators-kkb2c" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.276356 4961 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.276371 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l22gl\" (UniqueName: \"kubernetes.io/projected/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-kube-api-access-l22gl\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.276381 4961 
reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff8f8f6e-bb01-49c6-864a-9a98a57abea8-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.276973 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-catalog-content\") pod \"community-operators-kkb2c\" (UID: \"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1\") " pod="openshift-marketplace/community-operators-kkb2c" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.277085 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-utilities\") pod \"community-operators-kkb2c\" (UID: \"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1\") " pod="openshift-marketplace/community-operators-kkb2c" Jan 20 11:04:54 crc kubenswrapper[4961]: E0120 11:04:54.277216 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:54.777190665 +0000 UTC m=+47.561690626 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.305143 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwfpb\" (UniqueName: \"kubernetes.io/projected/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-kube-api-access-jwfpb\") pod \"community-operators-kkb2c\" (UID: \"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1\") " pod="openshift-marketplace/community-operators-kkb2c" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.326626 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nl7rg" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.381848 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:54 crc kubenswrapper[4961]: E0120 11:04:54.382305 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:54.882285394 +0000 UTC m=+47.666785365 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.431662 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kkb2c" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.435557 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7p747"] Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.464291 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rcphf"] Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.482710 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:54 crc kubenswrapper[4961]: E0120 11:04:54.483197 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 11:04:54.983174552 +0000 UTC m=+47.767674423 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.567573 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" event={"ID":"0359d20b-0121-43aa-8b56-04fc6210db6e","Type":"ContainerStarted","Data":"cc47a1a6b2be219e568739b75d4a88c23de0a315737131dc5971ab888dc73538"} Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.567629 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" event={"ID":"0359d20b-0121-43aa-8b56-04fc6210db6e","Type":"ContainerStarted","Data":"97483029ad6f8d2617ecfaa0f60c51a45718326ca3dad77772652493edb19b36"} Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.585804 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:54 crc kubenswrapper[4961]: E0120 11:04:54.586186 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 11:04:55.086174251 +0000 UTC m=+47.870674122 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-z54rs" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.590327 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" event={"ID":"ff8f8f6e-bb01-49c6-864a-9a98a57abea8","Type":"ContainerDied","Data":"cdabdf1b24e557b3d3dbbe7531aec1d26107dab4c250948baa02ad7c1fd92f06"} Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.590365 4961 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdabdf1b24e557b3d3dbbe7531aec1d26107dab4c250948baa02ad7c1fd92f06" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.590423 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29481780-rzljm" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.604958 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nszp2" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.643248 4961 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-20T11:04:54.141638345Z","Handler":null,"Name":""} Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.681817 4961 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.682186 4961 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.691016 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.796796 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.848571 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nl7rg"] Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.898238 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.902932 4961 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.902990 4961 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.960511 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-z54rs\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:54 crc kubenswrapper[4961]: I0120 11:04:54.963194 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kkb2c"] Jan 20 11:04:55 crc kubenswrapper[4961]: W0120 11:04:55.015931 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod195d7e5d_6ee3_44d3_9e8f_e1a2177fe4e1.slice/crio-fafa7ffe508139dc8fbdb354c63e22aafdc80ebd27e56062faba8954e2442be9 WatchSource:0}: Error finding container fafa7ffe508139dc8fbdb354c63e22aafdc80ebd27e56062faba8954e2442be9: Status 404 returned error can't find the container with id fafa7ffe508139dc8fbdb354c63e22aafdc80ebd27e56062faba8954e2442be9 Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.035657 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.036511 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.042772 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.049130 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.102025 4961 patch_prober.go:28] interesting pod/router-default-5444994796-6xx2v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 11:04:55 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Jan 20 11:04:55 crc kubenswrapper[4961]: [+]process-running ok Jan 20 11:04:55 crc kubenswrapper[4961]: healthz check failed Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.102103 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xx2v" podUID="d5ac0979-0fc9-48a6-8d22-6ba2c646287a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.312943 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-z54rs"] Jan 20 11:04:55 crc kubenswrapper[4961]: W0120 11:04:55.372921 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c38b7e7_a659_4038_aadf_b54948bfebf4.slice/crio-5c3624bbeb6985b106759993e86bc53d1aa6d084bc0033d114f858da33c151ff WatchSource:0}: Error finding container 5c3624bbeb6985b106759993e86bc53d1aa6d084bc0033d114f858da33c151ff: Status 404 returned error can't find the container with id 5c3624bbeb6985b106759993e86bc53d1aa6d084bc0033d114f858da33c151ff Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.544557 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.595143 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" event={"ID":"2c38b7e7-a659-4038-aadf-b54948bfebf4","Type":"ContainerStarted","Data":"d6e2d22b18e96e1680f4e2749cb5e3ef3b0305e71197cb6ccaf76135d1a02558"} Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.595191 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" event={"ID":"2c38b7e7-a659-4038-aadf-b54948bfebf4","Type":"ContainerStarted","Data":"5c3624bbeb6985b106759993e86bc53d1aa6d084bc0033d114f858da33c151ff"} Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.595256 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.596219 4961 generic.go:334] "Generic (PLEG): container finished" podID="195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1" containerID="b00047b72b94f4b3ec671a2af92f35d615137755e101ab4ca693cb0f81151f98" exitCode=0 Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.596305 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kkb2c" event={"ID":"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1","Type":"ContainerDied","Data":"b00047b72b94f4b3ec671a2af92f35d615137755e101ab4ca693cb0f81151f98"} Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.596348 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kkb2c" 
event={"ID":"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1","Type":"ContainerStarted","Data":"fafa7ffe508139dc8fbdb354c63e22aafdc80ebd27e56062faba8954e2442be9"} Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.597654 4961 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.597957 4961 generic.go:334] "Generic (PLEG): container finished" podID="4155767c-ce93-427a-9a44-d02d9fa3ac62" containerID="d273b16dd07814f8a8f81b9cf8c689f0bc26802ab7471922fb4a114a0d40822a" exitCode=0 Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.598014 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7p747" event={"ID":"4155767c-ce93-427a-9a44-d02d9fa3ac62","Type":"ContainerDied","Data":"d273b16dd07814f8a8f81b9cf8c689f0bc26802ab7471922fb4a114a0d40822a"} Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.598035 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7p747" event={"ID":"4155767c-ce93-427a-9a44-d02d9fa3ac62","Type":"ContainerStarted","Data":"36460b7aaa5849565684ffe3db3c944519dc840739ad685976f686d5776ab003"} Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.602591 4961 generic.go:334] "Generic (PLEG): container finished" podID="b27913bc-b262-4eb2-aeee-02a7365a3770" containerID="558d83005f243fb837af1602f2d070d769d1f05679a28b2b5de259418229e9da" exitCode=0 Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.603240 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl7rg" event={"ID":"b27913bc-b262-4eb2-aeee-02a7365a3770","Type":"ContainerDied","Data":"558d83005f243fb837af1602f2d070d769d1f05679a28b2b5de259418229e9da"} Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.603271 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl7rg" event={"ID":"b27913bc-b262-4eb2-aeee-02a7365a3770","Type":"ContainerStarted","Data":"7b2e1efafecbe4cd722e598738a46bf9466bee26c5d349a2406ef539a5eff86e"} Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.605992 4961 generic.go:334] "Generic (PLEG): container finished" podID="3449c15e-8212-40ed-85f5-37a0f79fd9e4" containerID="39be0bf829671ab76bdd882eee4ce9d31d99d765ba17ed7e90e97eaf5c0fcd61" exitCode=0 Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.606050 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcphf" event={"ID":"3449c15e-8212-40ed-85f5-37a0f79fd9e4","Type":"ContainerDied","Data":"39be0bf829671ab76bdd882eee4ce9d31d99d765ba17ed7e90e97eaf5c0fcd61"} Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.606097 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcphf" event={"ID":"3449c15e-8212-40ed-85f5-37a0f79fd9e4","Type":"ContainerStarted","Data":"3694d3b9303e0a6876c6eedabc3495c7796b4724e5a0915b7534c0c86e1a363d"} Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.613273 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" event={"ID":"0359d20b-0121-43aa-8b56-04fc6210db6e","Type":"ContainerStarted","Data":"d4d25f31d4bb2904597e459359da10a2ec091ea07954b5e83b786d8d37463e30"} Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.619608 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dxxsg" Jan 20 11:04:55 
crc kubenswrapper[4961]: I0120 11:04:55.629899 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" podStartSLOduration=24.629873283 podStartE2EDuration="24.629873283s" podCreationTimestamp="2026-01-20 11:04:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:55.623482712 +0000 UTC m=+48.407982583" watchObservedRunningTime="2026-01-20 11:04:55.629873283 +0000 UTC m=+48.414373164" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.652585 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-675m8"] Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.653760 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-675m8" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.656528 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.665866 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-675m8"] Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.709966 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52943ef2-6fee-4910-8dd2-3723b3575824-catalog-content\") pod \"redhat-marketplace-675m8\" (UID: \"52943ef2-6fee-4910-8dd2-3723b3575824\") " pod="openshift-marketplace/redhat-marketplace-675m8" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.711254 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52943ef2-6fee-4910-8dd2-3723b3575824-utilities\") pod \"redhat-marketplace-675m8\" (UID: \"52943ef2-6fee-4910-8dd2-3723b3575824\") " pod="openshift-marketplace/redhat-marketplace-675m8" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.712428 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2v6b\" (UniqueName: \"kubernetes.io/projected/52943ef2-6fee-4910-8dd2-3723b3575824-kube-api-access-t2v6b\") pod \"redhat-marketplace-675m8\" (UID: \"52943ef2-6fee-4910-8dd2-3723b3575824\") " pod="openshift-marketplace/redhat-marketplace-675m8" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.756084 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-xd5gg" podStartSLOduration=12.756046121 podStartE2EDuration="12.756046121s" podCreationTimestamp="2026-01-20 11:04:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:04:55.755822095 +0000 UTC m=+48.540321966" watchObservedRunningTime="2026-01-20 11:04:55.756046121 +0000 UTC m=+48.540545992" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.813323 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52943ef2-6fee-4910-8dd2-3723b3575824-utilities\") pod \"redhat-marketplace-675m8\" (UID: \"52943ef2-6fee-4910-8dd2-3723b3575824\") " pod="openshift-marketplace/redhat-marketplace-675m8" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 
11:04:55.813395 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2v6b\" (UniqueName: \"kubernetes.io/projected/52943ef2-6fee-4910-8dd2-3723b3575824-kube-api-access-t2v6b\") pod \"redhat-marketplace-675m8\" (UID: \"52943ef2-6fee-4910-8dd2-3723b3575824\") " pod="openshift-marketplace/redhat-marketplace-675m8" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.813474 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52943ef2-6fee-4910-8dd2-3723b3575824-catalog-content\") pod \"redhat-marketplace-675m8\" (UID: \"52943ef2-6fee-4910-8dd2-3723b3575824\") " pod="openshift-marketplace/redhat-marketplace-675m8" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.813901 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52943ef2-6fee-4910-8dd2-3723b3575824-catalog-content\") pod \"redhat-marketplace-675m8\" (UID: \"52943ef2-6fee-4910-8dd2-3723b3575824\") " pod="openshift-marketplace/redhat-marketplace-675m8" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.814237 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52943ef2-6fee-4910-8dd2-3723b3575824-utilities\") pod \"redhat-marketplace-675m8\" (UID: \"52943ef2-6fee-4910-8dd2-3723b3575824\") " pod="openshift-marketplace/redhat-marketplace-675m8" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.835200 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2v6b\" (UniqueName: \"kubernetes.io/projected/52943ef2-6fee-4910-8dd2-3723b3575824-kube-api-access-t2v6b\") pod \"redhat-marketplace-675m8\" (UID: \"52943ef2-6fee-4910-8dd2-3723b3575824\") " pod="openshift-marketplace/redhat-marketplace-675m8" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.870024 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.870764 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.873759 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.874238 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.877200 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.914560 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ea770e1-2ed6-4390-9cfd-edc3af318f4d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3ea770e1-2ed6-4390-9cfd-edc3af318f4d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.914816 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ea770e1-2ed6-4390-9cfd-edc3af318f4d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3ea770e1-2ed6-4390-9cfd-edc3af318f4d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.984472 4961 patch_prober.go:28] interesting pod/downloads-7954f5f757-7szjl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.984540 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7szjl" podUID="cf90e4e1-8fba-4df9-8e30-b392921d4d16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.984472 4961 patch_prober.go:28] interesting pod/downloads-7954f5f757-7szjl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.984627 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-7szjl" podUID="cf90e4e1-8fba-4df9-8e30-b392921d4d16" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Jan 20 11:04:55 crc kubenswrapper[4961]: I0120 11:04:55.991690 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-675m8" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.004742 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.005501 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.015467 4961 patch_prober.go:28] interesting pod/console-f9d7485db-2cs6z container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.015530 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-2cs6z" podUID="fc451e0b-ed99-4138-8e62-01d91d2c914f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.015646 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ea770e1-2ed6-4390-9cfd-edc3af318f4d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3ea770e1-2ed6-4390-9cfd-edc3af318f4d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.015745 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ea770e1-2ed6-4390-9cfd-edc3af318f4d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3ea770e1-2ed6-4390-9cfd-edc3af318f4d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.016055 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ea770e1-2ed6-4390-9cfd-edc3af318f4d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3ea770e1-2ed6-4390-9cfd-edc3af318f4d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.033441 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ea770e1-2ed6-4390-9cfd-edc3af318f4d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3ea770e1-2ed6-4390-9cfd-edc3af318f4d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.056531 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4qwb6"] Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.057733 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qwb6" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.078212 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qwb6"] Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.096532 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.099950 4961 patch_prober.go:28] interesting pod/router-default-5444994796-6xx2v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 11:04:56 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Jan 20 11:04:56 crc kubenswrapper[4961]: [+]process-running ok Jan 20 11:04:56 crc kubenswrapper[4961]: healthz check failed Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.100008 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xx2v" podUID="d5ac0979-0fc9-48a6-8d22-6ba2c646287a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.116887 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4148776c-da93-4f3a-b552-fd5ea25d572b-utilities\") pod \"redhat-marketplace-4qwb6\" (UID: \"4148776c-da93-4f3a-b552-fd5ea25d572b\") " pod="openshift-marketplace/redhat-marketplace-4qwb6" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.116952 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66ppq\" (UniqueName: \"kubernetes.io/projected/4148776c-da93-4f3a-b552-fd5ea25d572b-kube-api-access-66ppq\") pod \"redhat-marketplace-4qwb6\" (UID: \"4148776c-da93-4f3a-b552-fd5ea25d572b\") " pod="openshift-marketplace/redhat-marketplace-4qwb6" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.116974 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4148776c-da93-4f3a-b552-fd5ea25d572b-catalog-content\") pod \"redhat-marketplace-4qwb6\" (UID: \"4148776c-da93-4f3a-b552-fd5ea25d572b\") " pod="openshift-marketplace/redhat-marketplace-4qwb6" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.158960 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.159053 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.166176 4961 patch_prober.go:28] interesting pod/apiserver-76f77b778f-7r52t container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 20 11:04:56 crc kubenswrapper[4961]: [+]log ok Jan 20 11:04:56 crc kubenswrapper[4961]: [+]etcd ok Jan 20 11:04:56 crc kubenswrapper[4961]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 20 11:04:56 crc kubenswrapper[4961]: [+]poststarthook/generic-apiserver-start-informers ok Jan 20 11:04:56 crc kubenswrapper[4961]: 
[+]poststarthook/max-in-flight-filter ok Jan 20 11:04:56 crc kubenswrapper[4961]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 20 11:04:56 crc kubenswrapper[4961]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 20 11:04:56 crc kubenswrapper[4961]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 20 11:04:56 crc kubenswrapper[4961]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 20 11:04:56 crc kubenswrapper[4961]: [+]poststarthook/project.openshift.io-projectcache ok Jan 20 11:04:56 crc kubenswrapper[4961]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 20 11:04:56 crc kubenswrapper[4961]: [+]poststarthook/openshift.io-startinformers ok Jan 20 11:04:56 crc kubenswrapper[4961]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 20 11:04:56 crc kubenswrapper[4961]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 20 11:04:56 crc kubenswrapper[4961]: livez check failed Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.166279 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-7r52t" podUID="a81eef7a-7f29-4f79-863e-c3e009b56ad8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:04:56 crc kubenswrapper[4961]: E0120 11:04:56.181609 4961 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.189208 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 11:04:56 crc kubenswrapper[4961]: E0120 11:04:56.196742 4961 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 20 11:04:56 crc kubenswrapper[4961]: E0120 11:04:56.202565 4961 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 20 11:04:56 crc kubenswrapper[4961]: E0120 11:04:56.202621 4961 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" podUID="68d3d73a-7ef8-49ee-ae94-2d73115e126e" containerName="kube-multus-additional-cni-plugins" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.222033 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4148776c-da93-4f3a-b552-fd5ea25d572b-utilities\") pod \"redhat-marketplace-4qwb6\" (UID: \"4148776c-da93-4f3a-b552-fd5ea25d572b\") " pod="openshift-marketplace/redhat-marketplace-4qwb6" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.222168 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66ppq\" (UniqueName: \"kubernetes.io/projected/4148776c-da93-4f3a-b552-fd5ea25d572b-kube-api-access-66ppq\") pod \"redhat-marketplace-4qwb6\" (UID: \"4148776c-da93-4f3a-b552-fd5ea25d572b\") " pod="openshift-marketplace/redhat-marketplace-4qwb6" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.222233 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4148776c-da93-4f3a-b552-fd5ea25d572b-catalog-content\") pod \"redhat-marketplace-4qwb6\" (UID: \"4148776c-da93-4f3a-b552-fd5ea25d572b\") " pod="openshift-marketplace/redhat-marketplace-4qwb6" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.223044 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4148776c-da93-4f3a-b552-fd5ea25d572b-utilities\") pod \"redhat-marketplace-4qwb6\" (UID: \"4148776c-da93-4f3a-b552-fd5ea25d572b\") " pod="openshift-marketplace/redhat-marketplace-4qwb6" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.223444 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4148776c-da93-4f3a-b552-fd5ea25d572b-catalog-content\") pod \"redhat-marketplace-4qwb6\" (UID: \"4148776c-da93-4f3a-b552-fd5ea25d572b\") " pod="openshift-marketplace/redhat-marketplace-4qwb6" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.242895 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66ppq\" (UniqueName: \"kubernetes.io/projected/4148776c-da93-4f3a-b552-fd5ea25d572b-kube-api-access-66ppq\") pod \"redhat-marketplace-4qwb6\" (UID: 
\"4148776c-da93-4f3a-b552-fd5ea25d572b\") " pod="openshift-marketplace/redhat-marketplace-4qwb6" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.297885 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-675m8"] Jan 20 11:04:56 crc kubenswrapper[4961]: W0120 11:04:56.330302 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52943ef2_6fee_4910_8dd2_3723b3575824.slice/crio-eb7b8a712cf9ec2441c725e853b77acb524183007d51d1a80ca2223eba6e8395 WatchSource:0}: Error finding container eb7b8a712cf9ec2441c725e853b77acb524183007d51d1a80ca2223eba6e8395: Status 404 returned error can't find the container with id eb7b8a712cf9ec2441c725e853b77acb524183007d51d1a80ca2223eba6e8395 Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.391859 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qwb6" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.509834 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.631754 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3ea770e1-2ed6-4390-9cfd-edc3af318f4d","Type":"ContainerStarted","Data":"fcea5244295295edb802024d653f7fe5d320f9848bdc9801a93d43579eea15ff"} Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.652383 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-675m8" event={"ID":"52943ef2-6fee-4910-8dd2-3723b3575824","Type":"ContainerStarted","Data":"dcb0ae588665ea4664fa21aec906b834140b5698b935a21193ed9be87e8018ea"} Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.652429 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-675m8" event={"ID":"52943ef2-6fee-4910-8dd2-3723b3575824","Type":"ContainerStarted","Data":"eb7b8a712cf9ec2441c725e853b77acb524183007d51d1a80ca2223eba6e8395"} Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.657927 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h2djc"] Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.659261 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h2djc" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.662510 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.670041 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h2djc"] Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.743775 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvpbg\" (UniqueName: \"kubernetes.io/projected/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-kube-api-access-hvpbg\") pod \"redhat-operators-h2djc\" (UID: \"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef\") " pod="openshift-marketplace/redhat-operators-h2djc" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.743892 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-catalog-content\") pod \"redhat-operators-h2djc\" (UID: \"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef\") " pod="openshift-marketplace/redhat-operators-h2djc" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.751814 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-utilities\") pod \"redhat-operators-h2djc\" (UID: \"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef\") " pod="openshift-marketplace/redhat-operators-h2djc" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.857563 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-catalog-content\") pod \"redhat-operators-h2djc\" (UID: \"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef\") " pod="openshift-marketplace/redhat-operators-h2djc" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.858288 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-utilities\") pod \"redhat-operators-h2djc\" (UID: \"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef\") " pod="openshift-marketplace/redhat-operators-h2djc" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.858379 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-catalog-content\") pod \"redhat-operators-h2djc\" (UID: \"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef\") " pod="openshift-marketplace/redhat-operators-h2djc" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.858831 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-utilities\") pod \"redhat-operators-h2djc\" (UID: \"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef\") " pod="openshift-marketplace/redhat-operators-h2djc" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.866119 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvpbg\" (UniqueName: \"kubernetes.io/projected/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-kube-api-access-hvpbg\") pod \"redhat-operators-h2djc\" (UID: \"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef\") " 
pod="openshift-marketplace/redhat-operators-h2djc" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.892434 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvpbg\" (UniqueName: \"kubernetes.io/projected/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-kube-api-access-hvpbg\") pod \"redhat-operators-h2djc\" (UID: \"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef\") " pod="openshift-marketplace/redhat-operators-h2djc" Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.926993 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qwb6"] Jan 20 11:04:56 crc kubenswrapper[4961]: W0120 11:04:56.946880 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4148776c_da93_4f3a_b552_fd5ea25d572b.slice/crio-ca40e582b4ddc0d82425bd9a51cf91b8be7c4c5c5e6f6496af7087213c3a202c WatchSource:0}: Error finding container ca40e582b4ddc0d82425bd9a51cf91b8be7c4c5c5e6f6496af7087213c3a202c: Status 404 returned error can't find the container with id ca40e582b4ddc0d82425bd9a51cf91b8be7c4c5c5e6f6496af7087213c3a202c Jan 20 11:04:56 crc kubenswrapper[4961]: I0120 11:04:56.963581 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.051579 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9gx4d"] Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.053347 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9gx4d" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.065362 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9gx4d"] Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.066124 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h2djc" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.067927 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454f3576-0963-4929-ac48-4651f534a99c-catalog-content\") pod \"redhat-operators-9gx4d\" (UID: \"454f3576-0963-4929-ac48-4651f534a99c\") " pod="openshift-marketplace/redhat-operators-9gx4d" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.068010 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454f3576-0963-4929-ac48-4651f534a99c-utilities\") pod \"redhat-operators-9gx4d\" (UID: \"454f3576-0963-4929-ac48-4651f534a99c\") " pod="openshift-marketplace/redhat-operators-9gx4d" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.068095 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghqzc\" (UniqueName: \"kubernetes.io/projected/454f3576-0963-4929-ac48-4651f534a99c-kube-api-access-ghqzc\") pod \"redhat-operators-9gx4d\" (UID: \"454f3576-0963-4929-ac48-4651f534a99c\") " pod="openshift-marketplace/redhat-operators-9gx4d" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.104088 4961 patch_prober.go:28] interesting pod/router-default-5444994796-6xx2v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 11:04:57 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Jan 20 11:04:57 crc kubenswrapper[4961]: [+]process-running ok Jan 20 11:04:57 crc kubenswrapper[4961]: healthz check failed Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.104180 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xx2v" podUID="d5ac0979-0fc9-48a6-8d22-6ba2c646287a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.176765 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghqzc\" (UniqueName: \"kubernetes.io/projected/454f3576-0963-4929-ac48-4651f534a99c-kube-api-access-ghqzc\") pod \"redhat-operators-9gx4d\" (UID: \"454f3576-0963-4929-ac48-4651f534a99c\") " pod="openshift-marketplace/redhat-operators-9gx4d" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.176872 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454f3576-0963-4929-ac48-4651f534a99c-catalog-content\") pod \"redhat-operators-9gx4d\" (UID: \"454f3576-0963-4929-ac48-4651f534a99c\") " pod="openshift-marketplace/redhat-operators-9gx4d" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.176937 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454f3576-0963-4929-ac48-4651f534a99c-utilities\") pod \"redhat-operators-9gx4d\" (UID: \"454f3576-0963-4929-ac48-4651f534a99c\") " pod="openshift-marketplace/redhat-operators-9gx4d" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.177916 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454f3576-0963-4929-ac48-4651f534a99c-utilities\") pod 
\"redhat-operators-9gx4d\" (UID: \"454f3576-0963-4929-ac48-4651f534a99c\") " pod="openshift-marketplace/redhat-operators-9gx4d" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.178115 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454f3576-0963-4929-ac48-4651f534a99c-catalog-content\") pod \"redhat-operators-9gx4d\" (UID: \"454f3576-0963-4929-ac48-4651f534a99c\") " pod="openshift-marketplace/redhat-operators-9gx4d" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.206384 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghqzc\" (UniqueName: \"kubernetes.io/projected/454f3576-0963-4929-ac48-4651f534a99c-kube-api-access-ghqzc\") pod \"redhat-operators-9gx4d\" (UID: \"454f3576-0963-4929-ac48-4651f534a99c\") " pod="openshift-marketplace/redhat-operators-9gx4d" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.380456 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.380967 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.380999 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.381043 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.384415 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.389304 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.392684 4961 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.393104 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9gx4d" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.393362 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.454752 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.458231 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.487120 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.688107 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3ea770e1-2ed6-4390-9cfd-edc3af318f4d","Type":"ContainerStarted","Data":"852ed11a0939fe4d11adc52178cee1db4c52c7f44da6261cf2b7d50f4eed0c80"} Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.714036 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h2djc"] Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.715107 4961 generic.go:334] "Generic (PLEG): container finished" podID="4148776c-da93-4f3a-b552-fd5ea25d572b" containerID="7eae1a1a131f63837ab533c3e56e4cebf83c6c72175e61f1121ec9e58539ec48" exitCode=0 Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.715155 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qwb6" event={"ID":"4148776c-da93-4f3a-b552-fd5ea25d572b","Type":"ContainerDied","Data":"7eae1a1a131f63837ab533c3e56e4cebf83c6c72175e61f1121ec9e58539ec48"} Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.715175 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qwb6" event={"ID":"4148776c-da93-4f3a-b552-fd5ea25d572b","Type":"ContainerStarted","Data":"ca40e582b4ddc0d82425bd9a51cf91b8be7c4c5c5e6f6496af7087213c3a202c"} Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.719831 4961 generic.go:334] "Generic (PLEG): container finished" podID="52943ef2-6fee-4910-8dd2-3723b3575824" containerID="dcb0ae588665ea4664fa21aec906b834140b5698b935a21193ed9be87e8018ea" exitCode=0 Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.720521 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-675m8" 
event={"ID":"52943ef2-6fee-4910-8dd2-3723b3575824","Type":"ContainerDied","Data":"dcb0ae588665ea4664fa21aec906b834140b5698b935a21193ed9be87e8018ea"} Jan 20 11:04:57 crc kubenswrapper[4961]: I0120 11:04:57.852707 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9gx4d"] Jan 20 11:04:57 crc kubenswrapper[4961]: W0120 11:04:57.913694 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod454f3576_0963_4929_ac48_4651f534a99c.slice/crio-063cc9275e882af71517bcc92a1c4677b084c305843470136e717573b2d38fa0 WatchSource:0}: Error finding container 063cc9275e882af71517bcc92a1c4677b084c305843470136e717573b2d38fa0: Status 404 returned error can't find the container with id 063cc9275e882af71517bcc92a1c4677b084c305843470136e717573b2d38fa0 Jan 20 11:04:58 crc kubenswrapper[4961]: I0120 11:04:58.103712 4961 patch_prober.go:28] interesting pod/router-default-5444994796-6xx2v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 11:04:58 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Jan 20 11:04:58 crc kubenswrapper[4961]: [+]process-running ok Jan 20 11:04:58 crc kubenswrapper[4961]: healthz check failed Jan 20 11:04:58 crc kubenswrapper[4961]: I0120 11:04:58.103806 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xx2v" podUID="d5ac0979-0fc9-48a6-8d22-6ba2c646287a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:04:58 crc kubenswrapper[4961]: W0120 11:04:58.182341 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-195e80b1298d120ca589e88fb3114255fd08838afedb87330e7ba852d9e8240b WatchSource:0}: Error finding container 195e80b1298d120ca589e88fb3114255fd08838afedb87330e7ba852d9e8240b: Status 404 returned error can't find the container with id 195e80b1298d120ca589e88fb3114255fd08838afedb87330e7ba852d9e8240b Jan 20 11:04:58 crc kubenswrapper[4961]: W0120 11:04:58.183215 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-1378d723ab2878612355408451fce17eebfb17b8951f70841bf127bc96d39ae4 WatchSource:0}: Error finding container 1378d723ab2878612355408451fce17eebfb17b8951f70841bf127bc96d39ae4: Status 404 returned error can't find the container with id 1378d723ab2878612355408451fce17eebfb17b8951f70841bf127bc96d39ae4 Jan 20 11:04:58 crc kubenswrapper[4961]: I0120 11:04:58.732091 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3ea770e1-2ed6-4390-9cfd-edc3af318f4d","Type":"ContainerDied","Data":"852ed11a0939fe4d11adc52178cee1db4c52c7f44da6261cf2b7d50f4eed0c80"} Jan 20 11:04:58 crc kubenswrapper[4961]: I0120 11:04:58.732053 4961 generic.go:334] "Generic (PLEG): container finished" podID="3ea770e1-2ed6-4390-9cfd-edc3af318f4d" containerID="852ed11a0939fe4d11adc52178cee1db4c52c7f44da6261cf2b7d50f4eed0c80" exitCode=0 Jan 20 11:04:58 crc kubenswrapper[4961]: I0120 11:04:58.734588 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"195e80b1298d120ca589e88fb3114255fd08838afedb87330e7ba852d9e8240b"} Jan 20 11:04:58 crc kubenswrapper[4961]: I0120 11:04:58.736362 4961 generic.go:334] "Generic (PLEG): container finished" podID="03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef" containerID="cfa126d8eee7b0123aaca3bc28a38d5ae9dcebf829e885c71d533b8a71f810c0" exitCode=0 Jan 20 11:04:58 crc kubenswrapper[4961]: I0120 11:04:58.736436 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h2djc" event={"ID":"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef","Type":"ContainerDied","Data":"cfa126d8eee7b0123aaca3bc28a38d5ae9dcebf829e885c71d533b8a71f810c0"} Jan 20 11:04:58 crc kubenswrapper[4961]: I0120 11:04:58.736476 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h2djc" event={"ID":"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef","Type":"ContainerStarted","Data":"ead4b0c5adc9712208f8252246af9c6ab7ed282ec2ae149cfca2574d57e744d1"} Jan 20 11:04:58 crc kubenswrapper[4961]: I0120 11:04:58.740081 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gx4d" event={"ID":"454f3576-0963-4929-ac48-4651f534a99c","Type":"ContainerStarted","Data":"063cc9275e882af71517bcc92a1c4677b084c305843470136e717573b2d38fa0"} Jan 20 11:04:58 crc kubenswrapper[4961]: I0120 11:04:58.741373 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"fe94b4f7bcee0d91406d0298df73fe16abcf9a6bdf836ac0dd1e4b96b4be43cd"} Jan 20 11:04:58 crc kubenswrapper[4961]: I0120 11:04:58.749267 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1378d723ab2878612355408451fce17eebfb17b8951f70841bf127bc96d39ae4"} Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.050999 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.102868 4961 patch_prober.go:28] interesting pod/router-default-5444994796-6xx2v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 11:04:59 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Jan 20 11:04:59 crc kubenswrapper[4961]: [+]process-running ok Jan 20 11:04:59 crc kubenswrapper[4961]: healthz check failed Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.103280 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xx2v" podUID="d5ac0979-0fc9-48a6-8d22-6ba2c646287a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.110308 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ea770e1-2ed6-4390-9cfd-edc3af318f4d-kube-api-access\") pod \"3ea770e1-2ed6-4390-9cfd-edc3af318f4d\" (UID: \"3ea770e1-2ed6-4390-9cfd-edc3af318f4d\") " Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.110371 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ea770e1-2ed6-4390-9cfd-edc3af318f4d-kubelet-dir\") pod \"3ea770e1-2ed6-4390-9cfd-edc3af318f4d\" (UID: \"3ea770e1-2ed6-4390-9cfd-edc3af318f4d\") " Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.110623 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ea770e1-2ed6-4390-9cfd-edc3af318f4d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3ea770e1-2ed6-4390-9cfd-edc3af318f4d" (UID: "3ea770e1-2ed6-4390-9cfd-edc3af318f4d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.111342 4961 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ea770e1-2ed6-4390-9cfd-edc3af318f4d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.119024 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ea770e1-2ed6-4390-9cfd-edc3af318f4d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3ea770e1-2ed6-4390-9cfd-edc3af318f4d" (UID: "3ea770e1-2ed6-4390-9cfd-edc3af318f4d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.213362 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ea770e1-2ed6-4390-9cfd-edc3af318f4d-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.788633 4961 generic.go:334] "Generic (PLEG): container finished" podID="454f3576-0963-4929-ac48-4651f534a99c" containerID="e88fe4bc70bc40e70dc0e4c3c24e3e8a673247ada8398c3cbb8cc8eafd7587b3" exitCode=0 Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.789165 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gx4d" event={"ID":"454f3576-0963-4929-ac48-4651f534a99c","Type":"ContainerDied","Data":"e88fe4bc70bc40e70dc0e4c3c24e3e8a673247ada8398c3cbb8cc8eafd7587b3"} Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.803199 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"214301eed9568af6edf4593b8216d540a20d66ef8ceceef76fa8defd1582b765"} Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.803365 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.807023 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c68f8802569f173683aabdcd58f57daf8e92851c505656ca1b8dbc2264330a6c"} Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.810271 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3ea770e1-2ed6-4390-9cfd-edc3af318f4d","Type":"ContainerDied","Data":"fcea5244295295edb802024d653f7fe5d320f9848bdc9801a93d43579eea15ff"} Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.810315 4961 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcea5244295295edb802024d653f7fe5d320f9848bdc9801a93d43579eea15ff" Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.810391 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 11:04:59 crc kubenswrapper[4961]: I0120 11:04:59.818597 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"03f16343e5d5ad41cca224861d7c1ba139aa7afa09848481934c78c988f1d936"} Jan 20 11:05:00 crc kubenswrapper[4961]: I0120 11:05:00.098219 4961 patch_prober.go:28] interesting pod/router-default-5444994796-6xx2v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 11:05:00 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Jan 20 11:05:00 crc kubenswrapper[4961]: [+]process-running ok Jan 20 11:05:00 crc kubenswrapper[4961]: healthz check failed Jan 20 11:05:00 crc kubenswrapper[4961]: I0120 11:05:00.098282 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xx2v" podUID="d5ac0979-0fc9-48a6-8d22-6ba2c646287a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:05:00 crc kubenswrapper[4961]: I0120 11:05:00.743262 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 11:05:00 crc kubenswrapper[4961]: I0120 11:05:00.769227 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 20 11:05:01 crc kubenswrapper[4961]: I0120 11:05:01.098292 4961 patch_prober.go:28] interesting pod/router-default-5444994796-6xx2v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 11:05:01 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Jan 20 11:05:01 crc kubenswrapper[4961]: [+]process-running ok Jan 20 11:05:01 crc kubenswrapper[4961]: healthz check failed Jan 20 11:05:01 crc kubenswrapper[4961]: I0120 11:05:01.098394 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xx2v" podUID="d5ac0979-0fc9-48a6-8d22-6ba2c646287a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:05:01 crc kubenswrapper[4961]: I0120 11:05:01.163715 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:05:01 crc kubenswrapper[4961]: I0120 11:05:01.169697 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-7r52t" Jan 20 11:05:01 crc kubenswrapper[4961]: I0120 11:05:01.226135 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.226113401 podStartE2EDuration="1.226113401s" podCreationTimestamp="2026-01-20 11:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:05:01.189248998 +0000 UTC m=+53.973748869" watchObservedRunningTime="2026-01-20 11:05:01.226113401 +0000 UTC m=+54.010613272" Jan 20 11:05:01 crc kubenswrapper[4961]: I0120 11:05:01.777612 4961 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 20 11:05:01 crc kubenswrapper[4961]: E0120 11:05:01.778543 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ea770e1-2ed6-4390-9cfd-edc3af318f4d" containerName="pruner" Jan 20 11:05:01 crc kubenswrapper[4961]: I0120 11:05:01.778566 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ea770e1-2ed6-4390-9cfd-edc3af318f4d" containerName="pruner" Jan 20 11:05:01 crc kubenswrapper[4961]: I0120 11:05:01.778712 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ea770e1-2ed6-4390-9cfd-edc3af318f4d" containerName="pruner" Jan 20 11:05:01 crc kubenswrapper[4961]: I0120 11:05:01.779433 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 11:05:01 crc kubenswrapper[4961]: I0120 11:05:01.782493 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 20 11:05:01 crc kubenswrapper[4961]: I0120 11:05:01.785717 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 20 11:05:01 crc kubenswrapper[4961]: I0120 11:05:01.787584 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 20 11:05:01 crc kubenswrapper[4961]: I0120 11:05:01.896182 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9736b05e-fb05-47e6-af9f-eae93b064566-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9736b05e-fb05-47e6-af9f-eae93b064566\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 11:05:01 crc kubenswrapper[4961]: I0120 11:05:01.896235 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9736b05e-fb05-47e6-af9f-eae93b064566-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9736b05e-fb05-47e6-af9f-eae93b064566\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 11:05:01 crc kubenswrapper[4961]: I0120 11:05:01.998720 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9736b05e-fb05-47e6-af9f-eae93b064566-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9736b05e-fb05-47e6-af9f-eae93b064566\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 11:05:01 crc kubenswrapper[4961]: I0120 11:05:01.998788 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9736b05e-fb05-47e6-af9f-eae93b064566-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9736b05e-fb05-47e6-af9f-eae93b064566\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 11:05:02 crc kubenswrapper[4961]: I0120 11:05:02.000569 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9736b05e-fb05-47e6-af9f-eae93b064566-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9736b05e-fb05-47e6-af9f-eae93b064566\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 11:05:02 crc kubenswrapper[4961]: I0120 11:05:02.039501 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/9736b05e-fb05-47e6-af9f-eae93b064566-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9736b05e-fb05-47e6-af9f-eae93b064566\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 11:05:02 crc kubenswrapper[4961]: I0120 11:05:02.093001 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-44twq" Jan 20 11:05:02 crc kubenswrapper[4961]: I0120 11:05:02.100122 4961 patch_prober.go:28] interesting pod/router-default-5444994796-6xx2v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 11:05:02 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Jan 20 11:05:02 crc kubenswrapper[4961]: [+]process-running ok Jan 20 11:05:02 crc kubenswrapper[4961]: healthz check failed Jan 20 11:05:02 crc kubenswrapper[4961]: I0120 11:05:02.100175 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xx2v" podUID="d5ac0979-0fc9-48a6-8d22-6ba2c646287a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:05:02 crc kubenswrapper[4961]: I0120 11:05:02.125262 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 11:05:03 crc kubenswrapper[4961]: I0120 11:05:03.099106 4961 patch_prober.go:28] interesting pod/router-default-5444994796-6xx2v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 11:05:03 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Jan 20 11:05:03 crc kubenswrapper[4961]: [+]process-running ok Jan 20 11:05:03 crc kubenswrapper[4961]: healthz check failed Jan 20 11:05:03 crc kubenswrapper[4961]: I0120 11:05:03.099511 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xx2v" podUID="d5ac0979-0fc9-48a6-8d22-6ba2c646287a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:05:04 crc kubenswrapper[4961]: I0120 11:05:04.098879 4961 patch_prober.go:28] interesting pod/router-default-5444994796-6xx2v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 11:05:04 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Jan 20 11:05:04 crc kubenswrapper[4961]: [+]process-running ok Jan 20 11:05:04 crc kubenswrapper[4961]: healthz check failed Jan 20 11:05:04 crc kubenswrapper[4961]: I0120 11:05:04.098948 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xx2v" podUID="d5ac0979-0fc9-48a6-8d22-6ba2c646287a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:05:05 crc kubenswrapper[4961]: I0120 11:05:05.098364 4961 patch_prober.go:28] interesting pod/router-default-5444994796-6xx2v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 11:05:05 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Jan 20 11:05:05 crc kubenswrapper[4961]: [+]process-running ok Jan 20 11:05:05 crc 
kubenswrapper[4961]: healthz check failed Jan 20 11:05:05 crc kubenswrapper[4961]: I0120 11:05:05.098441 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6xx2v" podUID="d5ac0979-0fc9-48a6-8d22-6ba2c646287a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 11:05:06 crc kubenswrapper[4961]: I0120 11:05:06.005714 4961 patch_prober.go:28] interesting pod/console-f9d7485db-2cs6z container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Jan 20 11:05:06 crc kubenswrapper[4961]: I0120 11:05:06.005797 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-2cs6z" podUID="fc451e0b-ed99-4138-8e62-01d91d2c914f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" Jan 20 11:05:06 crc kubenswrapper[4961]: I0120 11:05:06.012403 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-7szjl" Jan 20 11:05:06 crc kubenswrapper[4961]: I0120 11:05:06.099689 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:05:06 crc kubenswrapper[4961]: I0120 11:05:06.103461 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-6xx2v" Jan 20 11:05:06 crc kubenswrapper[4961]: E0120 11:05:06.188541 4961 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 20 11:05:06 crc kubenswrapper[4961]: E0120 11:05:06.191294 4961 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 20 11:05:06 crc kubenswrapper[4961]: E0120 11:05:06.193149 4961 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 20 11:05:06 crc kubenswrapper[4961]: E0120 11:05:06.193189 4961 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" podUID="68d3d73a-7ef8-49ee-ae94-2d73115e126e" containerName="kube-multus-additional-cni-plugins" Jan 20 11:05:06 crc kubenswrapper[4961]: I0120 11:05:06.292940 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vzlgg" Jan 20 11:05:15 crc kubenswrapper[4961]: I0120 11:05:15.062609 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:05:16 crc kubenswrapper[4961]: I0120 11:05:16.107257 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:05:16 crc kubenswrapper[4961]: I0120 11:05:16.112532 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-2cs6z" Jan 20 11:05:16 crc kubenswrapper[4961]: E0120 11:05:16.212856 4961 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 20 11:05:16 crc kubenswrapper[4961]: E0120 11:05:16.215043 4961 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 20 11:05:16 crc kubenswrapper[4961]: E0120 11:05:16.216924 4961 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 20 11:05:16 crc kubenswrapper[4961]: E0120 11:05:16.216969 4961 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" podUID="68d3d73a-7ef8-49ee-ae94-2d73115e126e" containerName="kube-multus-additional-cni-plugins" Jan 20 11:05:22 crc kubenswrapper[4961]: I0120 11:05:22.974118 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-cczpb_68d3d73a-7ef8-49ee-ae94-2d73115e126e/kube-multus-additional-cni-plugins/0.log" Jan 20 11:05:22 crc kubenswrapper[4961]: I0120 11:05:22.974860 4961 generic.go:334] "Generic (PLEG): container finished" podID="68d3d73a-7ef8-49ee-ae94-2d73115e126e" containerID="ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca" exitCode=137 Jan 20 11:05:22 crc kubenswrapper[4961]: I0120 11:05:22.974902 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" event={"ID":"68d3d73a-7ef8-49ee-ae94-2d73115e126e","Type":"ContainerDied","Data":"ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca"} Jan 20 11:05:24 crc kubenswrapper[4961]: E0120 11:05:24.577613 4961 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 20 11:05:24 crc kubenswrapper[4961]: E0120 11:05:24.578283 4961 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hqlsp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-7p747_openshift-marketplace(4155767c-ce93-427a-9a44-d02d9fa3ac62): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 20 11:05:24 crc kubenswrapper[4961]: E0120 11:05:24.579735 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-7p747" podUID="4155767c-ce93-427a-9a44-d02d9fa3ac62" Jan 20 11:05:26 crc kubenswrapper[4961]: E0120 11:05:26.177015 4961 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca is running failed: container process not found" containerID="ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 20 11:05:26 crc kubenswrapper[4961]: E0120 11:05:26.178130 4961 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca is running failed: container process not found" containerID="ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 20 11:05:26 crc kubenswrapper[4961]: E0120 11:05:26.178389 4961 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca is running failed: container process not found" containerID="ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 20 11:05:26 crc kubenswrapper[4961]: E0120 11:05:26.178417 4961 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" podUID="68d3d73a-7ef8-49ee-ae94-2d73115e126e" containerName="kube-multus-additional-cni-plugins" Jan 20 11:05:26 crc kubenswrapper[4961]: I0120 11:05:26.971946 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zm7fw" Jan 20 11:05:27 crc kubenswrapper[4961]: E0120 11:05:27.374615 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-7p747" podUID="4155767c-ce93-427a-9a44-d02d9fa3ac62" Jan 20 11:05:27 crc kubenswrapper[4961]: E0120 11:05:27.489549 4961 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 20 11:05:27 crc kubenswrapper[4961]: E0120 11:05:27.490205 4961 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jwfpb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-kkb2c_openshift-marketplace(195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 20 11:05:27 crc kubenswrapper[4961]: E0120 11:05:27.491500 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-kkb2c" podUID="195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1" Jan 20 11:05:28 crc 
kubenswrapper[4961]: E0120 11:05:28.826781 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-kkb2c" podUID="195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1" Jan 20 11:05:28 crc kubenswrapper[4961]: E0120 11:05:28.914362 4961 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 20 11:05:28 crc kubenswrapper[4961]: E0120 11:05:28.914528 4961 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hx8s7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-nl7rg_openshift-marketplace(b27913bc-b262-4eb2-aeee-02a7365a3770): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 20 11:05:28 crc kubenswrapper[4961]: E0120 11:05:28.915830 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-nl7rg" podUID="b27913bc-b262-4eb2-aeee-02a7365a3770" Jan 20 11:05:28 crc kubenswrapper[4961]: E0120 11:05:28.916318 4961 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 20 11:05:28 crc kubenswrapper[4961]: E0120 11:05:28.916500 4961 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2dph7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-rcphf_openshift-marketplace(3449c15e-8212-40ed-85f5-37a0f79fd9e4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 20 11:05:28 crc kubenswrapper[4961]: E0120 11:05:28.917701 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-rcphf" podUID="3449c15e-8212-40ed-85f5-37a0f79fd9e4" Jan 20 11:05:32 crc kubenswrapper[4961]: E0120 11:05:32.143385 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-rcphf" podUID="3449c15e-8212-40ed-85f5-37a0f79fd9e4" Jan 20 11:05:32 crc kubenswrapper[4961]: E0120 11:05:32.143449 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-nl7rg" podUID="b27913bc-b262-4eb2-aeee-02a7365a3770" Jan 20 11:05:33 crc kubenswrapper[4961]: E0120 11:05:33.392990 4961 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 20 11:05:33 crc kubenswrapper[4961]: E0120 11:05:33.393546 4961 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-66ppq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-4qwb6_openshift-marketplace(4148776c-da93-4f3a-b552-fd5ea25d572b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 20 11:05:33 crc kubenswrapper[4961]: E0120 11:05:33.394727 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-4qwb6" podUID="4148776c-da93-4f3a-b552-fd5ea25d572b" Jan 20 11:05:33 crc kubenswrapper[4961]: I0120 11:05:33.513353 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-cczpb_68d3d73a-7ef8-49ee-ae94-2d73115e126e/kube-multus-additional-cni-plugins/0.log" Jan 20 11:05:33 crc kubenswrapper[4961]: I0120 11:05:33.513429 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:05:33 crc kubenswrapper[4961]: I0120 11:05:33.715807 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68d3d73a-7ef8-49ee-ae94-2d73115e126e-tuning-conf-dir\") pod \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\" (UID: \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\") " Jan 20 11:05:33 crc kubenswrapper[4961]: I0120 11:05:33.716215 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68d3d73a-7ef8-49ee-ae94-2d73115e126e-cni-sysctl-allowlist\") pod \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\" (UID: \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\") " Jan 20 11:05:33 crc kubenswrapper[4961]: I0120 11:05:33.716316 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/68d3d73a-7ef8-49ee-ae94-2d73115e126e-ready\") pod \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\" (UID: \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\") " Jan 20 11:05:33 crc kubenswrapper[4961]: I0120 11:05:33.716370 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs7d2\" (UniqueName: \"kubernetes.io/projected/68d3d73a-7ef8-49ee-ae94-2d73115e126e-kube-api-access-vs7d2\") pod \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\" (UID: \"68d3d73a-7ef8-49ee-ae94-2d73115e126e\") " Jan 20 11:05:33 crc kubenswrapper[4961]: I0120 11:05:33.715983 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68d3d73a-7ef8-49ee-ae94-2d73115e126e-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "68d3d73a-7ef8-49ee-ae94-2d73115e126e" (UID: "68d3d73a-7ef8-49ee-ae94-2d73115e126e"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 11:05:33 crc kubenswrapper[4961]: I0120 11:05:33.716638 4961 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68d3d73a-7ef8-49ee-ae94-2d73115e126e-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Jan 20 11:05:33 crc kubenswrapper[4961]: I0120 11:05:33.716977 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68d3d73a-7ef8-49ee-ae94-2d73115e126e-ready" (OuterVolumeSpecName: "ready") pod "68d3d73a-7ef8-49ee-ae94-2d73115e126e" (UID: "68d3d73a-7ef8-49ee-ae94-2d73115e126e"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:05:33 crc kubenswrapper[4961]: I0120 11:05:33.717267 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68d3d73a-7ef8-49ee-ae94-2d73115e126e-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "68d3d73a-7ef8-49ee-ae94-2d73115e126e" (UID: "68d3d73a-7ef8-49ee-ae94-2d73115e126e"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:05:33 crc kubenswrapper[4961]: I0120 11:05:33.721646 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d3d73a-7ef8-49ee-ae94-2d73115e126e-kube-api-access-vs7d2" (OuterVolumeSpecName: "kube-api-access-vs7d2") pod "68d3d73a-7ef8-49ee-ae94-2d73115e126e" (UID: "68d3d73a-7ef8-49ee-ae94-2d73115e126e"). InnerVolumeSpecName "kube-api-access-vs7d2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:05:33 crc kubenswrapper[4961]: I0120 11:05:33.817227 4961 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/68d3d73a-7ef8-49ee-ae94-2d73115e126e-ready\") on node \"crc\" DevicePath \"\"" Jan 20 11:05:33 crc kubenswrapper[4961]: I0120 11:05:33.817265 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs7d2\" (UniqueName: \"kubernetes.io/projected/68d3d73a-7ef8-49ee-ae94-2d73115e126e-kube-api-access-vs7d2\") on node \"crc\" DevicePath \"\"" Jan 20 11:05:33 crc kubenswrapper[4961]: I0120 11:05:33.817279 4961 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68d3d73a-7ef8-49ee-ae94-2d73115e126e-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 20 11:05:33 crc kubenswrapper[4961]: I0120 11:05:33.851813 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 20 11:05:34 crc kubenswrapper[4961]: I0120 11:05:34.038157 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9736b05e-fb05-47e6-af9f-eae93b064566","Type":"ContainerStarted","Data":"7943ba58cdf38d978eb0d74f6956fc2ef02d1895c003ce66038d4540aedb4111"} Jan 20 11:05:34 crc kubenswrapper[4961]: I0120 11:05:34.040413 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-cczpb_68d3d73a-7ef8-49ee-ae94-2d73115e126e/kube-multus-additional-cni-plugins/0.log" Jan 20 11:05:34 crc kubenswrapper[4961]: I0120 11:05:34.040532 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" event={"ID":"68d3d73a-7ef8-49ee-ae94-2d73115e126e","Type":"ContainerDied","Data":"7f08d3d4d95c2a1776457b5b7921be61184652f9f827f1cdd8b7df0f01586d5e"} Jan 20 11:05:34 crc kubenswrapper[4961]: I0120 11:05:34.040555 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-cczpb" Jan 20 11:05:34 crc kubenswrapper[4961]: I0120 11:05:34.040627 4961 scope.go:117] "RemoveContainer" containerID="ed0053bbd24ab63b565e2904653176a5d7ca1ce157ddfe501a4cca835acb64ca" Jan 20 11:05:34 crc kubenswrapper[4961]: E0120 11:05:34.042374 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4qwb6" podUID="4148776c-da93-4f3a-b552-fd5ea25d572b" Jan 20 11:05:34 crc kubenswrapper[4961]: I0120 11:05:34.080293 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-cczpb"] Jan 20 11:05:34 crc kubenswrapper[4961]: I0120 11:05:34.083996 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-cczpb"] Jan 20 11:05:34 crc kubenswrapper[4961]: E0120 11:05:34.742294 4961 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 20 11:05:34 crc kubenswrapper[4961]: E0120 11:05:34.742796 4961 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2v6b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-675m8_openshift-marketplace(52943ef2-6fee-4910-8dd2-3723b3575824): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 20 11:05:34 crc kubenswrapper[4961]: E0120 11:05:34.744023 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" 
pod="openshift-marketplace/redhat-marketplace-675m8" podUID="52943ef2-6fee-4910-8dd2-3723b3575824" Jan 20 11:05:35 crc kubenswrapper[4961]: I0120 11:05:35.047510 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9736b05e-fb05-47e6-af9f-eae93b064566","Type":"ContainerStarted","Data":"1a8ad30010dfa274b1a6d71fd44bebe2709a91b5021c66f24e73cc0d9676ae54"} Jan 20 11:05:35 crc kubenswrapper[4961]: I0120 11:05:35.049949 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h2djc" event={"ID":"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef","Type":"ContainerStarted","Data":"0fd7392a69ec0622a61117714c5ce95be26e0a40cb8dbf1d3c5cf66315c0aad3"} Jan 20 11:05:35 crc kubenswrapper[4961]: E0120 11:05:35.058249 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-675m8" podUID="52943ef2-6fee-4910-8dd2-3723b3575824" Jan 20 11:05:35 crc kubenswrapper[4961]: I0120 11:05:35.084143 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=34.084117637 podStartE2EDuration="34.084117637s" podCreationTimestamp="2026-01-20 11:05:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:05:35.061453372 +0000 UTC m=+87.845953243" watchObservedRunningTime="2026-01-20 11:05:35.084117637 +0000 UTC m=+87.868617518" Jan 20 11:05:35 crc kubenswrapper[4961]: I0120 11:05:35.550032 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d3d73a-7ef8-49ee-ae94-2d73115e126e" path="/var/lib/kubelet/pods/68d3d73a-7ef8-49ee-ae94-2d73115e126e/volumes" Jan 20 11:05:36 crc kubenswrapper[4961]: I0120 11:05:36.060031 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9736b05e-fb05-47e6-af9f-eae93b064566","Type":"ContainerDied","Data":"1a8ad30010dfa274b1a6d71fd44bebe2709a91b5021c66f24e73cc0d9676ae54"} Jan 20 11:05:36 crc kubenswrapper[4961]: I0120 11:05:36.059828 4961 generic.go:334] "Generic (PLEG): container finished" podID="9736b05e-fb05-47e6-af9f-eae93b064566" containerID="1a8ad30010dfa274b1a6d71fd44bebe2709a91b5021c66f24e73cc0d9676ae54" exitCode=0 Jan 20 11:05:36 crc kubenswrapper[4961]: I0120 11:05:36.063960 4961 generic.go:334] "Generic (PLEG): container finished" podID="03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef" containerID="0fd7392a69ec0622a61117714c5ce95be26e0a40cb8dbf1d3c5cf66315c0aad3" exitCode=0 Jan 20 11:05:36 crc kubenswrapper[4961]: I0120 11:05:36.064031 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h2djc" event={"ID":"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef","Type":"ContainerDied","Data":"0fd7392a69ec0622a61117714c5ce95be26e0a40cb8dbf1d3c5cf66315c0aad3"} Jan 20 11:05:36 crc kubenswrapper[4961]: I0120 11:05:36.066571 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gx4d" event={"ID":"454f3576-0963-4929-ac48-4651f534a99c","Type":"ContainerStarted","Data":"c5eeec70f0735f7b05d5da34103a4b5cf446480a4aa4e5c529185f7e929cf395"} Jan 20 11:05:36 crc kubenswrapper[4961]: I0120 11:05:36.955944 4961 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 20 11:05:36 crc kubenswrapper[4961]: E0120 11:05:36.956992 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d3d73a-7ef8-49ee-ae94-2d73115e126e" containerName="kube-multus-additional-cni-plugins" Jan 20 11:05:36 crc kubenswrapper[4961]: I0120 11:05:36.957015 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d3d73a-7ef8-49ee-ae94-2d73115e126e" containerName="kube-multus-additional-cni-plugins" Jan 20 11:05:36 crc kubenswrapper[4961]: I0120 11:05:36.957176 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d3d73a-7ef8-49ee-ae94-2d73115e126e" containerName="kube-multus-additional-cni-plugins" Jan 20 11:05:36 crc kubenswrapper[4961]: I0120 11:05:36.957646 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 11:05:36 crc kubenswrapper[4961]: I0120 11:05:36.968697 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.062643 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/801026d7-50ac-4add-ac8a-a44eaa67ac43-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"801026d7-50ac-4add-ac8a-a44eaa67ac43\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.062842 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/801026d7-50ac-4add-ac8a-a44eaa67ac43-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"801026d7-50ac-4add-ac8a-a44eaa67ac43\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.073598 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h2djc" event={"ID":"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef","Type":"ContainerStarted","Data":"52619eb6a5e5d75703b016d88aab8b70c37ae836cfac1c2b88dc58d7e8dfa5ef"} Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.076109 4961 generic.go:334] "Generic (PLEG): container finished" podID="454f3576-0963-4929-ac48-4651f534a99c" containerID="c5eeec70f0735f7b05d5da34103a4b5cf446480a4aa4e5c529185f7e929cf395" exitCode=0 Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.076214 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gx4d" event={"ID":"454f3576-0963-4929-ac48-4651f534a99c","Type":"ContainerDied","Data":"c5eeec70f0735f7b05d5da34103a4b5cf446480a4aa4e5c529185f7e929cf395"} Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.093075 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h2djc" podStartSLOduration=3.157280979 podStartE2EDuration="41.093045293s" podCreationTimestamp="2026-01-20 11:04:56 +0000 UTC" firstStartedPulling="2026-01-20 11:04:58.742137236 +0000 UTC m=+51.526637107" lastFinishedPulling="2026-01-20 11:05:36.67790155 +0000 UTC m=+89.462401421" observedRunningTime="2026-01-20 11:05:37.092731216 +0000 UTC m=+89.877231087" watchObservedRunningTime="2026-01-20 11:05:37.093045293 +0000 UTC m=+89.877545164" Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.163806 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/801026d7-50ac-4add-ac8a-a44eaa67ac43-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"801026d7-50ac-4add-ac8a-a44eaa67ac43\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.163870 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/801026d7-50ac-4add-ac8a-a44eaa67ac43-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"801026d7-50ac-4add-ac8a-a44eaa67ac43\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.164572 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/801026d7-50ac-4add-ac8a-a44eaa67ac43-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"801026d7-50ac-4add-ac8a-a44eaa67ac43\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.183005 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/801026d7-50ac-4add-ac8a-a44eaa67ac43-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"801026d7-50ac-4add-ac8a-a44eaa67ac43\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.275643 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.430261 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.472712 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9736b05e-fb05-47e6-af9f-eae93b064566-kube-api-access\") pod \"9736b05e-fb05-47e6-af9f-eae93b064566\" (UID: \"9736b05e-fb05-47e6-af9f-eae93b064566\") " Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.472997 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9736b05e-fb05-47e6-af9f-eae93b064566-kubelet-dir\") pod \"9736b05e-fb05-47e6-af9f-eae93b064566\" (UID: \"9736b05e-fb05-47e6-af9f-eae93b064566\") " Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.473093 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9736b05e-fb05-47e6-af9f-eae93b064566-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9736b05e-fb05-47e6-af9f-eae93b064566" (UID: "9736b05e-fb05-47e6-af9f-eae93b064566"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.473370 4961 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9736b05e-fb05-47e6-af9f-eae93b064566-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.474986 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.488523 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9736b05e-fb05-47e6-af9f-eae93b064566-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9736b05e-fb05-47e6-af9f-eae93b064566" (UID: "9736b05e-fb05-47e6-af9f-eae93b064566"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.561489 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.574943 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9736b05e-fb05-47e6-af9f-eae93b064566-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 11:05:37 crc kubenswrapper[4961]: I0120 11:05:37.718879 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 20 11:05:37 crc kubenswrapper[4961]: W0120 11:05:37.730251 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod801026d7_50ac_4add_ac8a_a44eaa67ac43.slice/crio-311d3e2e2f3974c21eef26f426f1a578ba7c4bf3d8da73ba8c785256f3febcbb WatchSource:0}: Error finding container 311d3e2e2f3974c21eef26f426f1a578ba7c4bf3d8da73ba8c785256f3febcbb: Status 404 returned error can't find the container with id 311d3e2e2f3974c21eef26f426f1a578ba7c4bf3d8da73ba8c785256f3febcbb Jan 20 11:05:38 crc kubenswrapper[4961]: I0120 11:05:38.081788 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"801026d7-50ac-4add-ac8a-a44eaa67ac43","Type":"ContainerStarted","Data":"ce23f8e732ff415da655affc975b5ea6426c6b722f27e25fc1d5bfe7c766d801"} Jan 20 11:05:38 crc kubenswrapper[4961]: I0120 11:05:38.081844 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"801026d7-50ac-4add-ac8a-a44eaa67ac43","Type":"ContainerStarted","Data":"311d3e2e2f3974c21eef26f426f1a578ba7c4bf3d8da73ba8c785256f3febcbb"} Jan 20 11:05:38 crc kubenswrapper[4961]: I0120 11:05:38.084113 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gx4d" event={"ID":"454f3576-0963-4929-ac48-4651f534a99c","Type":"ContainerStarted","Data":"b98fdb52382b8a0399f7cba027708b29c945a0fd9dd4ba448fc6235bac25de2d"} Jan 20 11:05:38 crc kubenswrapper[4961]: I0120 11:05:38.086176 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 11:05:38 crc kubenswrapper[4961]: I0120 11:05:38.086561 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9736b05e-fb05-47e6-af9f-eae93b064566","Type":"ContainerDied","Data":"7943ba58cdf38d978eb0d74f6956fc2ef02d1895c003ce66038d4540aedb4111"} Jan 20 11:05:38 crc kubenswrapper[4961]: I0120 11:05:38.086582 4961 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7943ba58cdf38d978eb0d74f6956fc2ef02d1895c003ce66038d4540aedb4111" Jan 20 11:05:38 crc kubenswrapper[4961]: I0120 11:05:38.111449 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.111428786 podStartE2EDuration="2.111428786s" podCreationTimestamp="2026-01-20 11:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:05:38.096379014 +0000 UTC m=+90.880878875" watchObservedRunningTime="2026-01-20 11:05:38.111428786 +0000 UTC m=+90.895928657" Jan 20 11:05:38 crc kubenswrapper[4961]: I0120 11:05:38.111844 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.111838676 podStartE2EDuration="1.111838676s" podCreationTimestamp="2026-01-20 11:05:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:05:38.107897531 +0000 UTC m=+90.892397412" watchObservedRunningTime="2026-01-20 11:05:38.111838676 +0000 UTC m=+90.896338547" Jan 20 11:05:38 crc kubenswrapper[4961]: I0120 11:05:38.129893 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9gx4d" podStartSLOduration=3.370742854 podStartE2EDuration="41.12986239s" podCreationTimestamp="2026-01-20 11:04:57 +0000 UTC" firstStartedPulling="2026-01-20 11:04:59.793247523 +0000 UTC m=+52.577747394" lastFinishedPulling="2026-01-20 11:05:37.552367059 +0000 UTC m=+90.336866930" observedRunningTime="2026-01-20 11:05:38.127320399 +0000 UTC m=+90.911820280" watchObservedRunningTime="2026-01-20 11:05:38.12986239 +0000 UTC m=+90.914362261" Jan 20 11:05:39 crc kubenswrapper[4961]: I0120 11:05:39.095015 4961 generic.go:334] "Generic (PLEG): container finished" podID="801026d7-50ac-4add-ac8a-a44eaa67ac43" containerID="ce23f8e732ff415da655affc975b5ea6426c6b722f27e25fc1d5bfe7c766d801" exitCode=0 Jan 20 11:05:39 crc kubenswrapper[4961]: I0120 11:05:39.095074 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"801026d7-50ac-4add-ac8a-a44eaa67ac43","Type":"ContainerDied","Data":"ce23f8e732ff415da655affc975b5ea6426c6b722f27e25fc1d5bfe7c766d801"} Jan 20 11:05:40 crc kubenswrapper[4961]: I0120 11:05:40.337767 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 11:05:40 crc kubenswrapper[4961]: I0120 11:05:40.406921 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/801026d7-50ac-4add-ac8a-a44eaa67ac43-kubelet-dir\") pod \"801026d7-50ac-4add-ac8a-a44eaa67ac43\" (UID: \"801026d7-50ac-4add-ac8a-a44eaa67ac43\") " Jan 20 11:05:40 crc kubenswrapper[4961]: I0120 11:05:40.407084 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/801026d7-50ac-4add-ac8a-a44eaa67ac43-kube-api-access\") pod \"801026d7-50ac-4add-ac8a-a44eaa67ac43\" (UID: \"801026d7-50ac-4add-ac8a-a44eaa67ac43\") " Jan 20 11:05:40 crc kubenswrapper[4961]: I0120 11:05:40.407192 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/801026d7-50ac-4add-ac8a-a44eaa67ac43-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "801026d7-50ac-4add-ac8a-a44eaa67ac43" (UID: "801026d7-50ac-4add-ac8a-a44eaa67ac43"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 11:05:40 crc kubenswrapper[4961]: I0120 11:05:40.407878 4961 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/801026d7-50ac-4add-ac8a-a44eaa67ac43-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 20 11:05:40 crc kubenswrapper[4961]: I0120 11:05:40.414401 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801026d7-50ac-4add-ac8a-a44eaa67ac43-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "801026d7-50ac-4add-ac8a-a44eaa67ac43" (UID: "801026d7-50ac-4add-ac8a-a44eaa67ac43"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:05:40 crc kubenswrapper[4961]: I0120 11:05:40.509224 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/801026d7-50ac-4add-ac8a-a44eaa67ac43-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 11:05:41 crc kubenswrapper[4961]: I0120 11:05:41.107341 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"801026d7-50ac-4add-ac8a-a44eaa67ac43","Type":"ContainerDied","Data":"311d3e2e2f3974c21eef26f426f1a578ba7c4bf3d8da73ba8c785256f3febcbb"} Jan 20 11:05:41 crc kubenswrapper[4961]: I0120 11:05:41.107395 4961 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="311d3e2e2f3974c21eef26f426f1a578ba7c4bf3d8da73ba8c785256f3febcbb" Jan 20 11:05:41 crc kubenswrapper[4961]: I0120 11:05:41.107440 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 11:05:41 crc kubenswrapper[4961]: I0120 11:05:41.576280 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.156833 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 20 11:05:44 crc kubenswrapper[4961]: E0120 11:05:44.157376 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801026d7-50ac-4add-ac8a-a44eaa67ac43" containerName="pruner" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.157390 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="801026d7-50ac-4add-ac8a-a44eaa67ac43" containerName="pruner" Jan 20 11:05:44 crc kubenswrapper[4961]: E0120 11:05:44.157412 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9736b05e-fb05-47e6-af9f-eae93b064566" containerName="pruner" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.157418 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="9736b05e-fb05-47e6-af9f-eae93b064566" containerName="pruner" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.157516 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="801026d7-50ac-4add-ac8a-a44eaa67ac43" containerName="pruner" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.157532 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="9736b05e-fb05-47e6-af9f-eae93b064566" containerName="pruner" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.157878 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.159548 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.160435 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.162786 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.223829 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=3.223811133 podStartE2EDuration="3.223811133s" podCreationTimestamp="2026-01-20 11:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:05:44.204445067 +0000 UTC m=+96.988944948" watchObservedRunningTime="2026-01-20 11:05:44.223811133 +0000 UTC m=+97.008311004" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.266856 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ffc62a1-dc68-4a34-8ebf-11662a07f343-kube-api-access\") pod \"installer-9-crc\" (UID: \"7ffc62a1-dc68-4a34-8ebf-11662a07f343\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.266920 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ffc62a1-dc68-4a34-8ebf-11662a07f343-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7ffc62a1-dc68-4a34-8ebf-11662a07f343\") " 
pod="openshift-kube-apiserver/installer-9-crc" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.266994 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7ffc62a1-dc68-4a34-8ebf-11662a07f343-var-lock\") pod \"installer-9-crc\" (UID: \"7ffc62a1-dc68-4a34-8ebf-11662a07f343\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.368656 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7ffc62a1-dc68-4a34-8ebf-11662a07f343-var-lock\") pod \"installer-9-crc\" (UID: \"7ffc62a1-dc68-4a34-8ebf-11662a07f343\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.368795 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ffc62a1-dc68-4a34-8ebf-11662a07f343-kube-api-access\") pod \"installer-9-crc\" (UID: \"7ffc62a1-dc68-4a34-8ebf-11662a07f343\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.368828 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ffc62a1-dc68-4a34-8ebf-11662a07f343-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7ffc62a1-dc68-4a34-8ebf-11662a07f343\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.368940 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ffc62a1-dc68-4a34-8ebf-11662a07f343-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7ffc62a1-dc68-4a34-8ebf-11662a07f343\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.368957 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7ffc62a1-dc68-4a34-8ebf-11662a07f343-var-lock\") pod \"installer-9-crc\" (UID: \"7ffc62a1-dc68-4a34-8ebf-11662a07f343\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.389779 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ffc62a1-dc68-4a34-8ebf-11662a07f343-kube-api-access\") pod \"installer-9-crc\" (UID: \"7ffc62a1-dc68-4a34-8ebf-11662a07f343\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 11:05:44 crc kubenswrapper[4961]: I0120 11:05:44.473171 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 20 11:05:47 crc kubenswrapper[4961]: I0120 11:05:47.067125 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h2djc" Jan 20 11:05:47 crc kubenswrapper[4961]: I0120 11:05:47.069786 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h2djc" Jan 20 11:05:47 crc kubenswrapper[4961]: I0120 11:05:47.393604 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9gx4d" Jan 20 11:05:47 crc kubenswrapper[4961]: I0120 11:05:47.393958 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9gx4d" Jan 20 11:05:47 crc kubenswrapper[4961]: I0120 11:05:47.506271 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9gx4d" Jan 20 11:05:47 crc kubenswrapper[4961]: I0120 11:05:47.507118 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h2djc" Jan 20 11:05:47 crc kubenswrapper[4961]: I0120 11:05:47.566821 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h2djc" Jan 20 11:05:48 crc kubenswrapper[4961]: I0120 11:05:48.146169 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 20 11:05:48 crc kubenswrapper[4961]: W0120 11:05:48.160472 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7ffc62a1_dc68_4a34_8ebf_11662a07f343.slice/crio-18b062c1bb98897f52af0fce5a54cf678b64df72a6a11b51d1f49529661176f1 WatchSource:0}: Error finding container 18b062c1bb98897f52af0fce5a54cf678b64df72a6a11b51d1f49529661176f1: Status 404 returned error can't find the container with id 18b062c1bb98897f52af0fce5a54cf678b64df72a6a11b51d1f49529661176f1 Jan 20 11:05:48 crc kubenswrapper[4961]: I0120 11:05:48.197461 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9gx4d" Jan 20 11:05:48 crc kubenswrapper[4961]: I0120 11:05:48.972708 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9gx4d"] Jan 20 11:05:49 crc kubenswrapper[4961]: I0120 11:05:49.159253 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7ffc62a1-dc68-4a34-8ebf-11662a07f343","Type":"ContainerStarted","Data":"917ea4298748927ebf6fe44c3c6724e94b2fff418c36f9ac5da48641087bff4e"} Jan 20 11:05:49 crc kubenswrapper[4961]: I0120 11:05:49.159310 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7ffc62a1-dc68-4a34-8ebf-11662a07f343","Type":"ContainerStarted","Data":"18b062c1bb98897f52af0fce5a54cf678b64df72a6a11b51d1f49529661176f1"} Jan 20 11:05:49 crc kubenswrapper[4961]: I0120 11:05:49.161880 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kkb2c" event={"ID":"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1","Type":"ContainerStarted","Data":"13f20f0a3313c85937466bde214681fa0659ec1156660cdd7ed0ffd1e8fe2dd9"} Jan 20 11:05:49 crc kubenswrapper[4961]: I0120 11:05:49.165464 4961 generic.go:334] "Generic (PLEG): container finished" podID="4155767c-ce93-427a-9a44-d02d9fa3ac62" 
containerID="74a5a48c651ba32371b67994c5dafd4c4b050b89c5bbd1d7435f2ff602a51304" exitCode=0 Jan 20 11:05:49 crc kubenswrapper[4961]: I0120 11:05:49.165602 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7p747" event={"ID":"4155767c-ce93-427a-9a44-d02d9fa3ac62","Type":"ContainerDied","Data":"74a5a48c651ba32371b67994c5dafd4c4b050b89c5bbd1d7435f2ff602a51304"} Jan 20 11:05:50 crc kubenswrapper[4961]: I0120 11:05:50.173288 4961 generic.go:334] "Generic (PLEG): container finished" podID="195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1" containerID="13f20f0a3313c85937466bde214681fa0659ec1156660cdd7ed0ffd1e8fe2dd9" exitCode=0 Jan 20 11:05:50 crc kubenswrapper[4961]: I0120 11:05:50.173412 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kkb2c" event={"ID":"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1","Type":"ContainerDied","Data":"13f20f0a3313c85937466bde214681fa0659ec1156660cdd7ed0ffd1e8fe2dd9"} Jan 20 11:05:50 crc kubenswrapper[4961]: I0120 11:05:50.173760 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9gx4d" podUID="454f3576-0963-4929-ac48-4651f534a99c" containerName="registry-server" containerID="cri-o://b98fdb52382b8a0399f7cba027708b29c945a0fd9dd4ba448fc6235bac25de2d" gracePeriod=2 Jan 20 11:05:50 crc kubenswrapper[4961]: I0120 11:05:50.563648 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9gx4d" Jan 20 11:05:50 crc kubenswrapper[4961]: I0120 11:05:50.584155 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=6.584135797 podStartE2EDuration="6.584135797s" podCreationTimestamp="2026-01-20 11:05:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:05:50.23240413 +0000 UTC m=+103.016904021" watchObservedRunningTime="2026-01-20 11:05:50.584135797 +0000 UTC m=+103.368635668" Jan 20 11:05:50 crc kubenswrapper[4961]: I0120 11:05:50.751973 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454f3576-0963-4929-ac48-4651f534a99c-utilities\") pod \"454f3576-0963-4929-ac48-4651f534a99c\" (UID: \"454f3576-0963-4929-ac48-4651f534a99c\") " Jan 20 11:05:50 crc kubenswrapper[4961]: I0120 11:05:50.752165 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghqzc\" (UniqueName: \"kubernetes.io/projected/454f3576-0963-4929-ac48-4651f534a99c-kube-api-access-ghqzc\") pod \"454f3576-0963-4929-ac48-4651f534a99c\" (UID: \"454f3576-0963-4929-ac48-4651f534a99c\") " Jan 20 11:05:50 crc kubenswrapper[4961]: I0120 11:05:50.752590 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454f3576-0963-4929-ac48-4651f534a99c-catalog-content\") pod \"454f3576-0963-4929-ac48-4651f534a99c\" (UID: \"454f3576-0963-4929-ac48-4651f534a99c\") " Jan 20 11:05:50 crc kubenswrapper[4961]: I0120 11:05:50.752779 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/454f3576-0963-4929-ac48-4651f534a99c-utilities" (OuterVolumeSpecName: "utilities") pod "454f3576-0963-4929-ac48-4651f534a99c" (UID: "454f3576-0963-4929-ac48-4651f534a99c"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:05:50 crc kubenswrapper[4961]: I0120 11:05:50.752890 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454f3576-0963-4929-ac48-4651f534a99c-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 11:05:50 crc kubenswrapper[4961]: I0120 11:05:50.758832 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/454f3576-0963-4929-ac48-4651f534a99c-kube-api-access-ghqzc" (OuterVolumeSpecName: "kube-api-access-ghqzc") pod "454f3576-0963-4929-ac48-4651f534a99c" (UID: "454f3576-0963-4929-ac48-4651f534a99c"). InnerVolumeSpecName "kube-api-access-ghqzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:05:50 crc kubenswrapper[4961]: I0120 11:05:50.853777 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghqzc\" (UniqueName: \"kubernetes.io/projected/454f3576-0963-4929-ac48-4651f534a99c-kube-api-access-ghqzc\") on node \"crc\" DevicePath \"\"" Jan 20 11:05:50 crc kubenswrapper[4961]: I0120 11:05:50.890653 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/454f3576-0963-4929-ac48-4651f534a99c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "454f3576-0963-4929-ac48-4651f534a99c" (UID: "454f3576-0963-4929-ac48-4651f534a99c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:05:50 crc kubenswrapper[4961]: I0120 11:05:50.954969 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454f3576-0963-4929-ac48-4651f534a99c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.187447 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7p747" event={"ID":"4155767c-ce93-427a-9a44-d02d9fa3ac62","Type":"ContainerStarted","Data":"a30e9c9dd5df963c853613fc5a0081bfc456cfe12d50f6eedb41cfdd516fec6d"} Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.190480 4961 generic.go:334] "Generic (PLEG): container finished" podID="4148776c-da93-4f3a-b552-fd5ea25d572b" containerID="bd7d0b1af5eb860c86a5d33886ad68a3646ed97765ec5e34ab4ec4e1cbaa6935" exitCode=0 Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.190543 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qwb6" event={"ID":"4148776c-da93-4f3a-b552-fd5ea25d572b","Type":"ContainerDied","Data":"bd7d0b1af5eb860c86a5d33886ad68a3646ed97765ec5e34ab4ec4e1cbaa6935"} Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.192983 4961 generic.go:334] "Generic (PLEG): container finished" podID="b27913bc-b262-4eb2-aeee-02a7365a3770" containerID="c7b506a0440a3fa821d63529ebb12d67d46a6bc5dfe5116dc27bca660be113ee" exitCode=0 Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.193036 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl7rg" event={"ID":"b27913bc-b262-4eb2-aeee-02a7365a3770","Type":"ContainerDied","Data":"c7b506a0440a3fa821d63529ebb12d67d46a6bc5dfe5116dc27bca660be113ee"} Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.197052 4961 generic.go:334] "Generic (PLEG): container finished" podID="3449c15e-8212-40ed-85f5-37a0f79fd9e4" containerID="983f272a7e07d9df9f75c1a26238cb18c7b7f8067d906c6452b26fb3e6c2cbef" exitCode=0 Jan 20 
11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.197142 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcphf" event={"ID":"3449c15e-8212-40ed-85f5-37a0f79fd9e4","Type":"ContainerDied","Data":"983f272a7e07d9df9f75c1a26238cb18c7b7f8067d906c6452b26fb3e6c2cbef"} Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.203139 4961 generic.go:334] "Generic (PLEG): container finished" podID="454f3576-0963-4929-ac48-4651f534a99c" containerID="b98fdb52382b8a0399f7cba027708b29c945a0fd9dd4ba448fc6235bac25de2d" exitCode=0 Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.203198 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gx4d" event={"ID":"454f3576-0963-4929-ac48-4651f534a99c","Type":"ContainerDied","Data":"b98fdb52382b8a0399f7cba027708b29c945a0fd9dd4ba448fc6235bac25de2d"} Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.203243 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9gx4d" event={"ID":"454f3576-0963-4929-ac48-4651f534a99c","Type":"ContainerDied","Data":"063cc9275e882af71517bcc92a1c4677b084c305843470136e717573b2d38fa0"} Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.203254 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9gx4d" Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.203264 4961 scope.go:117] "RemoveContainer" containerID="b98fdb52382b8a0399f7cba027708b29c945a0fd9dd4ba448fc6235bac25de2d" Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.208719 4961 generic.go:334] "Generic (PLEG): container finished" podID="52943ef2-6fee-4910-8dd2-3723b3575824" containerID="eaaa3763085652778605d1c770f69efc257c579aca3c2d1e3ef7696cd35dfeed" exitCode=0 Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.208869 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-675m8" event={"ID":"52943ef2-6fee-4910-8dd2-3723b3575824","Type":"ContainerDied","Data":"eaaa3763085652778605d1c770f69efc257c579aca3c2d1e3ef7696cd35dfeed"} Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.218524 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kkb2c" event={"ID":"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1","Type":"ContainerStarted","Data":"175a9e9ed085e45ce593b258e40c8e6c59bdd3e7c5e99f8c307a3000605949cc"} Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.222939 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7p747" podStartSLOduration=3.184816102 podStartE2EDuration="58.222908662s" podCreationTimestamp="2026-01-20 11:04:53 +0000 UTC" firstStartedPulling="2026-01-20 11:04:55.59976352 +0000 UTC m=+48.384263391" lastFinishedPulling="2026-01-20 11:05:50.63785609 +0000 UTC m=+103.422355951" observedRunningTime="2026-01-20 11:05:51.21160028 +0000 UTC m=+103.996100151" watchObservedRunningTime="2026-01-20 11:05:51.222908662 +0000 UTC m=+104.007408543" Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.233493 4961 scope.go:117] "RemoveContainer" containerID="c5eeec70f0735f7b05d5da34103a4b5cf446480a4aa4e5c529185f7e929cf395" Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.266272 4961 scope.go:117] "RemoveContainer" containerID="e88fe4bc70bc40e70dc0e4c3c24e3e8a673247ada8398c3cbb8cc8eafd7587b3" Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.283838 4961 scope.go:117] 
"RemoveContainer" containerID="b98fdb52382b8a0399f7cba027708b29c945a0fd9dd4ba448fc6235bac25de2d" Jan 20 11:05:51 crc kubenswrapper[4961]: E0120 11:05:51.284702 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b98fdb52382b8a0399f7cba027708b29c945a0fd9dd4ba448fc6235bac25de2d\": container with ID starting with b98fdb52382b8a0399f7cba027708b29c945a0fd9dd4ba448fc6235bac25de2d not found: ID does not exist" containerID="b98fdb52382b8a0399f7cba027708b29c945a0fd9dd4ba448fc6235bac25de2d" Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.284786 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b98fdb52382b8a0399f7cba027708b29c945a0fd9dd4ba448fc6235bac25de2d"} err="failed to get container status \"b98fdb52382b8a0399f7cba027708b29c945a0fd9dd4ba448fc6235bac25de2d\": rpc error: code = NotFound desc = could not find container \"b98fdb52382b8a0399f7cba027708b29c945a0fd9dd4ba448fc6235bac25de2d\": container with ID starting with b98fdb52382b8a0399f7cba027708b29c945a0fd9dd4ba448fc6235bac25de2d not found: ID does not exist" Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.284860 4961 scope.go:117] "RemoveContainer" containerID="c5eeec70f0735f7b05d5da34103a4b5cf446480a4aa4e5c529185f7e929cf395" Jan 20 11:05:51 crc kubenswrapper[4961]: E0120 11:05:51.286304 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5eeec70f0735f7b05d5da34103a4b5cf446480a4aa4e5c529185f7e929cf395\": container with ID starting with c5eeec70f0735f7b05d5da34103a4b5cf446480a4aa4e5c529185f7e929cf395 not found: ID does not exist" containerID="c5eeec70f0735f7b05d5da34103a4b5cf446480a4aa4e5c529185f7e929cf395" Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.286360 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5eeec70f0735f7b05d5da34103a4b5cf446480a4aa4e5c529185f7e929cf395"} err="failed to get container status \"c5eeec70f0735f7b05d5da34103a4b5cf446480a4aa4e5c529185f7e929cf395\": rpc error: code = NotFound desc = could not find container \"c5eeec70f0735f7b05d5da34103a4b5cf446480a4aa4e5c529185f7e929cf395\": container with ID starting with c5eeec70f0735f7b05d5da34103a4b5cf446480a4aa4e5c529185f7e929cf395 not found: ID does not exist" Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.286395 4961 scope.go:117] "RemoveContainer" containerID="e88fe4bc70bc40e70dc0e4c3c24e3e8a673247ada8398c3cbb8cc8eafd7587b3" Jan 20 11:05:51 crc kubenswrapper[4961]: E0120 11:05:51.286869 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e88fe4bc70bc40e70dc0e4c3c24e3e8a673247ada8398c3cbb8cc8eafd7587b3\": container with ID starting with e88fe4bc70bc40e70dc0e4c3c24e3e8a673247ada8398c3cbb8cc8eafd7587b3 not found: ID does not exist" containerID="e88fe4bc70bc40e70dc0e4c3c24e3e8a673247ada8398c3cbb8cc8eafd7587b3" Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.286902 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e88fe4bc70bc40e70dc0e4c3c24e3e8a673247ada8398c3cbb8cc8eafd7587b3"} err="failed to get container status \"e88fe4bc70bc40e70dc0e4c3c24e3e8a673247ada8398c3cbb8cc8eafd7587b3\": rpc error: code = NotFound desc = could not find container \"e88fe4bc70bc40e70dc0e4c3c24e3e8a673247ada8398c3cbb8cc8eafd7587b3\": container with ID starting with 
e88fe4bc70bc40e70dc0e4c3c24e3e8a673247ada8398c3cbb8cc8eafd7587b3 not found: ID does not exist" Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.327710 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kkb2c" podStartSLOduration=2.234615662 podStartE2EDuration="57.327684524s" podCreationTimestamp="2026-01-20 11:04:54 +0000 UTC" firstStartedPulling="2026-01-20 11:04:55.597375624 +0000 UTC m=+48.381875495" lastFinishedPulling="2026-01-20 11:05:50.690444486 +0000 UTC m=+103.474944357" observedRunningTime="2026-01-20 11:05:51.316868814 +0000 UTC m=+104.101368695" watchObservedRunningTime="2026-01-20 11:05:51.327684524 +0000 UTC m=+104.112184415" Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.328797 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9gx4d"] Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.334153 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9gx4d"] Jan 20 11:05:51 crc kubenswrapper[4961]: I0120 11:05:51.546261 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="454f3576-0963-4929-ac48-4651f534a99c" path="/var/lib/kubelet/pods/454f3576-0963-4929-ac48-4651f534a99c/volumes" Jan 20 11:05:52 crc kubenswrapper[4961]: I0120 11:05:52.225460 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qwb6" event={"ID":"4148776c-da93-4f3a-b552-fd5ea25d572b","Type":"ContainerStarted","Data":"19d679693c60ae9439e3dbd74d829c8ae457998e5fa8124e547ea18e557ed530"} Jan 20 11:05:52 crc kubenswrapper[4961]: I0120 11:05:52.228189 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcphf" event={"ID":"3449c15e-8212-40ed-85f5-37a0f79fd9e4","Type":"ContainerStarted","Data":"9987973e3b02d36434ad467f5f731e314a091f54fa5e2497a3d42e709e6359e9"} Jan 20 11:05:52 crc kubenswrapper[4961]: I0120 11:05:52.231189 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-675m8" event={"ID":"52943ef2-6fee-4910-8dd2-3723b3575824","Type":"ContainerStarted","Data":"e1c02fe503f3dd6dea2e13339cb024dd72f437646290a4a9a5995e7deb1808bc"} Jan 20 11:05:52 crc kubenswrapper[4961]: I0120 11:05:52.249811 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4qwb6" podStartSLOduration=2.247350838 podStartE2EDuration="56.24978737s" podCreationTimestamp="2026-01-20 11:04:56 +0000 UTC" firstStartedPulling="2026-01-20 11:04:57.717852583 +0000 UTC m=+50.502352454" lastFinishedPulling="2026-01-20 11:05:51.720289125 +0000 UTC m=+104.504788986" observedRunningTime="2026-01-20 11:05:52.244354949 +0000 UTC m=+105.028854820" watchObservedRunningTime="2026-01-20 11:05:52.24978737 +0000 UTC m=+105.034287241" Jan 20 11:05:52 crc kubenswrapper[4961]: I0120 11:05:52.270038 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-675m8" podStartSLOduration=3.154657116 podStartE2EDuration="57.270016897s" podCreationTimestamp="2026-01-20 11:04:55 +0000 UTC" firstStartedPulling="2026-01-20 11:04:57.722202966 +0000 UTC m=+50.506702837" lastFinishedPulling="2026-01-20 11:05:51.837562737 +0000 UTC m=+104.622062618" observedRunningTime="2026-01-20 11:05:52.26684799 +0000 UTC m=+105.051347861" watchObservedRunningTime="2026-01-20 11:05:52.270016897 +0000 UTC m=+105.054516778" Jan 20 
11:05:52 crc kubenswrapper[4961]: I0120 11:05:52.287622 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rcphf" podStartSLOduration=2.958163134 podStartE2EDuration="59.28760221s" podCreationTimestamp="2026-01-20 11:04:53 +0000 UTC" firstStartedPulling="2026-01-20 11:04:55.607316309 +0000 UTC m=+48.391816180" lastFinishedPulling="2026-01-20 11:05:51.936755385 +0000 UTC m=+104.721255256" observedRunningTime="2026-01-20 11:05:52.284946576 +0000 UTC m=+105.069446467" watchObservedRunningTime="2026-01-20 11:05:52.28760221 +0000 UTC m=+105.072102081" Jan 20 11:05:53 crc kubenswrapper[4961]: I0120 11:05:53.239805 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl7rg" event={"ID":"b27913bc-b262-4eb2-aeee-02a7365a3770","Type":"ContainerStarted","Data":"6003f745b17bf28837ea7e3c9673c755b1b2edaa618059f58f02e3833740215b"} Jan 20 11:05:53 crc kubenswrapper[4961]: I0120 11:05:53.261149 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nl7rg" podStartSLOduration=3.6071169039999997 podStartE2EDuration="1m0.261131162s" podCreationTimestamp="2026-01-20 11:04:53 +0000 UTC" firstStartedPulling="2026-01-20 11:04:55.603741294 +0000 UTC m=+48.388241165" lastFinishedPulling="2026-01-20 11:05:52.257755552 +0000 UTC m=+105.042255423" observedRunningTime="2026-01-20 11:05:53.259037322 +0000 UTC m=+106.043537193" watchObservedRunningTime="2026-01-20 11:05:53.261131162 +0000 UTC m=+106.045631033" Jan 20 11:05:53 crc kubenswrapper[4961]: I0120 11:05:53.830413 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rcphf" Jan 20 11:05:53 crc kubenswrapper[4961]: I0120 11:05:53.830471 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rcphf" Jan 20 11:05:53 crc kubenswrapper[4961]: I0120 11:05:53.882433 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rcphf" Jan 20 11:05:54 crc kubenswrapper[4961]: I0120 11:05:54.036186 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7p747" Jan 20 11:05:54 crc kubenswrapper[4961]: I0120 11:05:54.036225 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7p747" Jan 20 11:05:54 crc kubenswrapper[4961]: I0120 11:05:54.069708 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7p747" Jan 20 11:05:54 crc kubenswrapper[4961]: I0120 11:05:54.327147 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nl7rg" Jan 20 11:05:54 crc kubenswrapper[4961]: I0120 11:05:54.327461 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nl7rg" Jan 20 11:05:54 crc kubenswrapper[4961]: I0120 11:05:54.372036 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nl7rg" Jan 20 11:05:54 crc kubenswrapper[4961]: I0120 11:05:54.432174 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kkb2c" Jan 20 11:05:54 crc kubenswrapper[4961]: I0120 11:05:54.432227 
4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kkb2c" Jan 20 11:05:54 crc kubenswrapper[4961]: I0120 11:05:54.469857 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kkb2c" Jan 20 11:05:55 crc kubenswrapper[4961]: I0120 11:05:55.309472 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kkb2c" Jan 20 11:05:55 crc kubenswrapper[4961]: I0120 11:05:55.992286 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-675m8" Jan 20 11:05:55 crc kubenswrapper[4961]: I0120 11:05:55.992362 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-675m8" Jan 20 11:05:56 crc kubenswrapper[4961]: I0120 11:05:56.027720 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-675m8" Jan 20 11:05:56 crc kubenswrapper[4961]: I0120 11:05:56.306587 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-675m8" Jan 20 11:05:56 crc kubenswrapper[4961]: I0120 11:05:56.392280 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4qwb6" Jan 20 11:05:56 crc kubenswrapper[4961]: I0120 11:05:56.392349 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4qwb6" Jan 20 11:05:56 crc kubenswrapper[4961]: I0120 11:05:56.427719 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4qwb6" Jan 20 11:05:56 crc kubenswrapper[4961]: I0120 11:05:56.774127 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kkb2c"] Jan 20 11:05:57 crc kubenswrapper[4961]: I0120 11:05:57.266422 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kkb2c" podUID="195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1" containerName="registry-server" containerID="cri-o://175a9e9ed085e45ce593b258e40c8e6c59bdd3e7c5e99f8c307a3000605949cc" gracePeriod=2 Jan 20 11:05:57 crc kubenswrapper[4961]: I0120 11:05:57.331630 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4qwb6" Jan 20 11:05:58 crc kubenswrapper[4961]: I0120 11:05:58.272782 4961 generic.go:334] "Generic (PLEG): container finished" podID="195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1" containerID="175a9e9ed085e45ce593b258e40c8e6c59bdd3e7c5e99f8c307a3000605949cc" exitCode=0 Jan 20 11:05:58 crc kubenswrapper[4961]: I0120 11:05:58.272876 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kkb2c" event={"ID":"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1","Type":"ContainerDied","Data":"175a9e9ed085e45ce593b258e40c8e6c59bdd3e7c5e99f8c307a3000605949cc"} Jan 20 11:05:58 crc kubenswrapper[4961]: I0120 11:05:58.853212 4961 util.go:48] "No ready sandbox for pod can be found. 
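The "SyncLoop (probe)" entries above walk each catalog pod through the same sequence: startup probe "unhealthy", then "started", then the readiness status flipping from empty to "ready". A throwaway sketch for pulling those transitions out of a saved kubelet journal; the regex is written against the exact quoting style of these entries, the kubelet.log path and helper name are placeholders, and it assumes one journal entry per line as journalctl normally emits:

```python
import re
from collections import defaultdict

# Matches entries like:
#   ... kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/..."
PROBE = re.compile(
    r'"SyncLoop \(probe\)" probe="(?P<probe>[^"]*)" '
    r'status="(?P<status>[^"]*)" pod="(?P<pod>[^"]*)"'
)

def probe_history(path: str):
    """Return pod -> [(timestamp, probe, status), ...] in log order."""
    history = defaultdict(list)
    with open(path) as f:
        for line in f:
            m = PROBE.search(line)
            if m:
                ts = " ".join(line.split()[:3])      # e.g. 'Jan 20 11:05:54'
                history[m["pod"]].append((ts, m["probe"], m["status"]))
    return history

if __name__ == "__main__":
    for pod, events in probe_history("kubelet.log").items():
        print(pod)
        for ts, probe, status in events:
            print(f"  {ts}  {probe:<9} -> {status or '(empty)'}")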
Need to start a new one" pod="openshift-marketplace/community-operators-kkb2c" Jan 20 11:05:58 crc kubenswrapper[4961]: I0120 11:05:58.859778 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-catalog-content\") pod \"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1\" (UID: \"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1\") " Jan 20 11:05:58 crc kubenswrapper[4961]: I0120 11:05:58.859829 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwfpb\" (UniqueName: \"kubernetes.io/projected/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-kube-api-access-jwfpb\") pod \"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1\" (UID: \"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1\") " Jan 20 11:05:58 crc kubenswrapper[4961]: I0120 11:05:58.859943 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-utilities\") pod \"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1\" (UID: \"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1\") " Jan 20 11:05:58 crc kubenswrapper[4961]: I0120 11:05:58.863630 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-utilities" (OuterVolumeSpecName: "utilities") pod "195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1" (UID: "195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:05:58 crc kubenswrapper[4961]: I0120 11:05:58.866212 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-kube-api-access-jwfpb" (OuterVolumeSpecName: "kube-api-access-jwfpb") pod "195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1" (UID: "195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1"). InnerVolumeSpecName "kube-api-access-jwfpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:05:58 crc kubenswrapper[4961]: I0120 11:05:58.923242 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1" (UID: "195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:05:58 crc kubenswrapper[4961]: I0120 11:05:58.960614 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 11:05:58 crc kubenswrapper[4961]: I0120 11:05:58.960870 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwfpb\" (UniqueName: \"kubernetes.io/projected/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-kube-api-access-jwfpb\") on node \"crc\" DevicePath \"\"" Jan 20 11:05:58 crc kubenswrapper[4961]: I0120 11:05:58.960950 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 11:05:59 crc kubenswrapper[4961]: I0120 11:05:59.284175 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kkb2c" event={"ID":"195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1","Type":"ContainerDied","Data":"fafa7ffe508139dc8fbdb354c63e22aafdc80ebd27e56062faba8954e2442be9"} Jan 20 11:05:59 crc kubenswrapper[4961]: I0120 11:05:59.284258 4961 scope.go:117] "RemoveContainer" containerID="175a9e9ed085e45ce593b258e40c8e6c59bdd3e7c5e99f8c307a3000605949cc" Jan 20 11:05:59 crc kubenswrapper[4961]: I0120 11:05:59.285621 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kkb2c" Jan 20 11:05:59 crc kubenswrapper[4961]: I0120 11:05:59.315286 4961 scope.go:117] "RemoveContainer" containerID="13f20f0a3313c85937466bde214681fa0659ec1156660cdd7ed0ffd1e8fe2dd9" Jan 20 11:05:59 crc kubenswrapper[4961]: I0120 11:05:59.336442 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kkb2c"] Jan 20 11:05:59 crc kubenswrapper[4961]: I0120 11:05:59.340961 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kkb2c"] Jan 20 11:05:59 crc kubenswrapper[4961]: I0120 11:05:59.369310 4961 scope.go:117] "RemoveContainer" containerID="b00047b72b94f4b3ec671a2af92f35d615137755e101ab4ca693cb0f81151f98" Jan 20 11:05:59 crc kubenswrapper[4961]: I0120 11:05:59.548827 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1" path="/var/lib/kubelet/pods/195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1/volumes" Jan 20 11:06:01 crc kubenswrapper[4961]: I0120 11:06:01.174924 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qwb6"] Jan 20 11:06:01 crc kubenswrapper[4961]: I0120 11:06:01.175202 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4qwb6" podUID="4148776c-da93-4f3a-b552-fd5ea25d572b" containerName="registry-server" containerID="cri-o://19d679693c60ae9439e3dbd74d829c8ae457998e5fa8124e547ea18e557ed530" gracePeriod=2 Jan 20 11:06:01 crc kubenswrapper[4961]: I0120 11:06:01.298748 4961 generic.go:334] "Generic (PLEG): container finished" podID="4148776c-da93-4f3a-b552-fd5ea25d572b" containerID="19d679693c60ae9439e3dbd74d829c8ae457998e5fa8124e547ea18e557ed530" exitCode=0 Jan 20 11:06:01 crc kubenswrapper[4961]: I0120 11:06:01.298788 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qwb6" 
event={"ID":"4148776c-da93-4f3a-b552-fd5ea25d572b","Type":"ContainerDied","Data":"19d679693c60ae9439e3dbd74d829c8ae457998e5fa8124e547ea18e557ed530"} Jan 20 11:06:01 crc kubenswrapper[4961]: I0120 11:06:01.560104 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qwb6" Jan 20 11:06:01 crc kubenswrapper[4961]: I0120 11:06:01.597701 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4148776c-da93-4f3a-b552-fd5ea25d572b-catalog-content\") pod \"4148776c-da93-4f3a-b552-fd5ea25d572b\" (UID: \"4148776c-da93-4f3a-b552-fd5ea25d572b\") " Jan 20 11:06:01 crc kubenswrapper[4961]: I0120 11:06:01.597854 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66ppq\" (UniqueName: \"kubernetes.io/projected/4148776c-da93-4f3a-b552-fd5ea25d572b-kube-api-access-66ppq\") pod \"4148776c-da93-4f3a-b552-fd5ea25d572b\" (UID: \"4148776c-da93-4f3a-b552-fd5ea25d572b\") " Jan 20 11:06:01 crc kubenswrapper[4961]: I0120 11:06:01.597933 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4148776c-da93-4f3a-b552-fd5ea25d572b-utilities\") pod \"4148776c-da93-4f3a-b552-fd5ea25d572b\" (UID: \"4148776c-da93-4f3a-b552-fd5ea25d572b\") " Jan 20 11:06:01 crc kubenswrapper[4961]: I0120 11:06:01.599406 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4148776c-da93-4f3a-b552-fd5ea25d572b-utilities" (OuterVolumeSpecName: "utilities") pod "4148776c-da93-4f3a-b552-fd5ea25d572b" (UID: "4148776c-da93-4f3a-b552-fd5ea25d572b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:06:01 crc kubenswrapper[4961]: I0120 11:06:01.605390 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4148776c-da93-4f3a-b552-fd5ea25d572b-kube-api-access-66ppq" (OuterVolumeSpecName: "kube-api-access-66ppq") pod "4148776c-da93-4f3a-b552-fd5ea25d572b" (UID: "4148776c-da93-4f3a-b552-fd5ea25d572b"). InnerVolumeSpecName "kube-api-access-66ppq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:06:01 crc kubenswrapper[4961]: I0120 11:06:01.628144 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4148776c-da93-4f3a-b552-fd5ea25d572b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4148776c-da93-4f3a-b552-fd5ea25d572b" (UID: "4148776c-da93-4f3a-b552-fd5ea25d572b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:06:01 crc kubenswrapper[4961]: I0120 11:06:01.702522 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4148776c-da93-4f3a-b552-fd5ea25d572b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:01 crc kubenswrapper[4961]: I0120 11:06:01.702566 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66ppq\" (UniqueName: \"kubernetes.io/projected/4148776c-da93-4f3a-b552-fd5ea25d572b-kube-api-access-66ppq\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:01 crc kubenswrapper[4961]: I0120 11:06:01.702582 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4148776c-da93-4f3a-b552-fd5ea25d572b-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:02 crc kubenswrapper[4961]: I0120 11:06:02.314820 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qwb6" event={"ID":"4148776c-da93-4f3a-b552-fd5ea25d572b","Type":"ContainerDied","Data":"ca40e582b4ddc0d82425bd9a51cf91b8be7c4c5c5e6f6496af7087213c3a202c"} Jan 20 11:06:02 crc kubenswrapper[4961]: I0120 11:06:02.315335 4961 scope.go:117] "RemoveContainer" containerID="19d679693c60ae9439e3dbd74d829c8ae457998e5fa8124e547ea18e557ed530" Jan 20 11:06:02 crc kubenswrapper[4961]: I0120 11:06:02.314941 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qwb6" Jan 20 11:06:02 crc kubenswrapper[4961]: I0120 11:06:02.341832 4961 scope.go:117] "RemoveContainer" containerID="bd7d0b1af5eb860c86a5d33886ad68a3646ed97765ec5e34ab4ec4e1cbaa6935" Jan 20 11:06:02 crc kubenswrapper[4961]: I0120 11:06:02.358698 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qwb6"] Jan 20 11:06:02 crc kubenswrapper[4961]: I0120 11:06:02.362002 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qwb6"] Jan 20 11:06:02 crc kubenswrapper[4961]: I0120 11:06:02.382344 4961 scope.go:117] "RemoveContainer" containerID="7eae1a1a131f63837ab533c3e56e4cebf83c6c72175e61f1121ec9e58539ec48" Jan 20 11:06:03 crc kubenswrapper[4961]: I0120 11:06:03.547453 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4148776c-da93-4f3a-b552-fd5ea25d572b" path="/var/lib/kubelet/pods/4148776c-da93-4f3a-b552-fd5ea25d572b/volumes" Jan 20 11:06:03 crc kubenswrapper[4961]: I0120 11:06:03.893848 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rcphf" Jan 20 11:06:04 crc kubenswrapper[4961]: I0120 11:06:04.090463 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7p747" Jan 20 11:06:04 crc kubenswrapper[4961]: I0120 11:06:04.369472 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nl7rg" Jan 20 11:06:06 crc kubenswrapper[4961]: I0120 11:06:06.176513 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nl7rg"] Jan 20 11:06:06 crc kubenswrapper[4961]: I0120 11:06:06.176756 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nl7rg" podUID="b27913bc-b262-4eb2-aeee-02a7365a3770" containerName="registry-server" 
containerID="cri-o://6003f745b17bf28837ea7e3c9673c755b1b2edaa618059f58f02e3833740215b" gracePeriod=2 Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.198916 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nl7rg" Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.276173 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b27913bc-b262-4eb2-aeee-02a7365a3770-utilities\") pod \"b27913bc-b262-4eb2-aeee-02a7365a3770\" (UID: \"b27913bc-b262-4eb2-aeee-02a7365a3770\") " Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.276542 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b27913bc-b262-4eb2-aeee-02a7365a3770-catalog-content\") pod \"b27913bc-b262-4eb2-aeee-02a7365a3770\" (UID: \"b27913bc-b262-4eb2-aeee-02a7365a3770\") " Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.276678 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx8s7\" (UniqueName: \"kubernetes.io/projected/b27913bc-b262-4eb2-aeee-02a7365a3770-kube-api-access-hx8s7\") pod \"b27913bc-b262-4eb2-aeee-02a7365a3770\" (UID: \"b27913bc-b262-4eb2-aeee-02a7365a3770\") " Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.278660 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b27913bc-b262-4eb2-aeee-02a7365a3770-utilities" (OuterVolumeSpecName: "utilities") pod "b27913bc-b262-4eb2-aeee-02a7365a3770" (UID: "b27913bc-b262-4eb2-aeee-02a7365a3770"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.282764 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b27913bc-b262-4eb2-aeee-02a7365a3770-kube-api-access-hx8s7" (OuterVolumeSpecName: "kube-api-access-hx8s7") pod "b27913bc-b262-4eb2-aeee-02a7365a3770" (UID: "b27913bc-b262-4eb2-aeee-02a7365a3770"). InnerVolumeSpecName "kube-api-access-hx8s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.323245 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b27913bc-b262-4eb2-aeee-02a7365a3770-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b27913bc-b262-4eb2-aeee-02a7365a3770" (UID: "b27913bc-b262-4eb2-aeee-02a7365a3770"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.351856 4961 generic.go:334] "Generic (PLEG): container finished" podID="b27913bc-b262-4eb2-aeee-02a7365a3770" containerID="6003f745b17bf28837ea7e3c9673c755b1b2edaa618059f58f02e3833740215b" exitCode=0 Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.351896 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl7rg" event={"ID":"b27913bc-b262-4eb2-aeee-02a7365a3770","Type":"ContainerDied","Data":"6003f745b17bf28837ea7e3c9673c755b1b2edaa618059f58f02e3833740215b"} Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.351928 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nl7rg" event={"ID":"b27913bc-b262-4eb2-aeee-02a7365a3770","Type":"ContainerDied","Data":"7b2e1efafecbe4cd722e598738a46bf9466bee26c5d349a2406ef539a5eff86e"} Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.351945 4961 scope.go:117] "RemoveContainer" containerID="6003f745b17bf28837ea7e3c9673c755b1b2edaa618059f58f02e3833740215b" Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.352352 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nl7rg" Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.368852 4961 scope.go:117] "RemoveContainer" containerID="c7b506a0440a3fa821d63529ebb12d67d46a6bc5dfe5116dc27bca660be113ee" Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.380946 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b27913bc-b262-4eb2-aeee-02a7365a3770-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.380982 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx8s7\" (UniqueName: \"kubernetes.io/projected/b27913bc-b262-4eb2-aeee-02a7365a3770-kube-api-access-hx8s7\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.380994 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b27913bc-b262-4eb2-aeee-02a7365a3770-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.382512 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nl7rg"] Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.394799 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nl7rg"] Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.413458 4961 scope.go:117] "RemoveContainer" containerID="558d83005f243fb837af1602f2d070d769d1f05679a28b2b5de259418229e9da" Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.423870 4961 scope.go:117] "RemoveContainer" containerID="6003f745b17bf28837ea7e3c9673c755b1b2edaa618059f58f02e3833740215b" Jan 20 11:06:07 crc kubenswrapper[4961]: E0120 11:06:07.424563 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6003f745b17bf28837ea7e3c9673c755b1b2edaa618059f58f02e3833740215b\": container with ID starting with 6003f745b17bf28837ea7e3c9673c755b1b2edaa618059f58f02e3833740215b not found: ID does not exist" containerID="6003f745b17bf28837ea7e3c9673c755b1b2edaa618059f58f02e3833740215b" Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.424703 
4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6003f745b17bf28837ea7e3c9673c755b1b2edaa618059f58f02e3833740215b"} err="failed to get container status \"6003f745b17bf28837ea7e3c9673c755b1b2edaa618059f58f02e3833740215b\": rpc error: code = NotFound desc = could not find container \"6003f745b17bf28837ea7e3c9673c755b1b2edaa618059f58f02e3833740215b\": container with ID starting with 6003f745b17bf28837ea7e3c9673c755b1b2edaa618059f58f02e3833740215b not found: ID does not exist" Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.424814 4961 scope.go:117] "RemoveContainer" containerID="c7b506a0440a3fa821d63529ebb12d67d46a6bc5dfe5116dc27bca660be113ee" Jan 20 11:06:07 crc kubenswrapper[4961]: E0120 11:06:07.425201 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7b506a0440a3fa821d63529ebb12d67d46a6bc5dfe5116dc27bca660be113ee\": container with ID starting with c7b506a0440a3fa821d63529ebb12d67d46a6bc5dfe5116dc27bca660be113ee not found: ID does not exist" containerID="c7b506a0440a3fa821d63529ebb12d67d46a6bc5dfe5116dc27bca660be113ee" Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.425230 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b506a0440a3fa821d63529ebb12d67d46a6bc5dfe5116dc27bca660be113ee"} err="failed to get container status \"c7b506a0440a3fa821d63529ebb12d67d46a6bc5dfe5116dc27bca660be113ee\": rpc error: code = NotFound desc = could not find container \"c7b506a0440a3fa821d63529ebb12d67d46a6bc5dfe5116dc27bca660be113ee\": container with ID starting with c7b506a0440a3fa821d63529ebb12d67d46a6bc5dfe5116dc27bca660be113ee not found: ID does not exist" Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.425248 4961 scope.go:117] "RemoveContainer" containerID="558d83005f243fb837af1602f2d070d769d1f05679a28b2b5de259418229e9da" Jan 20 11:06:07 crc kubenswrapper[4961]: E0120 11:06:07.425701 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"558d83005f243fb837af1602f2d070d769d1f05679a28b2b5de259418229e9da\": container with ID starting with 558d83005f243fb837af1602f2d070d769d1f05679a28b2b5de259418229e9da not found: ID does not exist" containerID="558d83005f243fb837af1602f2d070d769d1f05679a28b2b5de259418229e9da" Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.425727 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"558d83005f243fb837af1602f2d070d769d1f05679a28b2b5de259418229e9da"} err="failed to get container status \"558d83005f243fb837af1602f2d070d769d1f05679a28b2b5de259418229e9da\": rpc error: code = NotFound desc = could not find container \"558d83005f243fb837af1602f2d070d769d1f05679a28b2b5de259418229e9da\": container with ID starting with 558d83005f243fb837af1602f2d070d769d1f05679a28b2b5de259418229e9da not found: ID does not exist" Jan 20 11:06:07 crc kubenswrapper[4961]: I0120 11:06:07.551815 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b27913bc-b262-4eb2-aeee-02a7365a3770" path="/var/lib/kubelet/pods/b27913bc-b262-4eb2-aeee-02a7365a3770/volumes" Jan 20 11:06:10 crc kubenswrapper[4961]: I0120 11:06:10.573017 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jjg29"] Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.613889 4961 kubelet.go:2421] "SyncLoop ADD" source="file" 
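By this point the same teardown sequence has run three times (community-operators-kkb2c, redhat-marketplace-4qwb6, certified-operators-nl7rg): SyncLoop DELETE from the API, "Killing container with a grace period", ContainerDied, the reconciler unmounting the catalog-content, utilities and kube-api-access volumes, "Volume detached", SyncLoop REMOVE, and finally "Cleaned up orphaned pod volumes dir". The RemoveContainer / "NotFound" / "DeleteContainer returned error" triples in between are the kubelet asking CRI-O to delete containers it has already removed, so they read as noise rather than failures. A rough grouping sketch under the same assumptions as above (one entry per line; path, milestone names and function name are placeholders, patterns copied from the phrasing of these entries):

```python
import re

# Teardown milestones, in the order they appear in this log.
MILESTONES = [
    ("delete",  r'"SyncLoop DELETE".*pods=\["(?P<key>[^"]+)"\]'),
    ("kill",    r'"Killing container with a grace period".*pod="(?P<key>[^"]+)"'),
    ("died",    r'"SyncLoop \(PLEG\): event for pod".*pod="(?P<key>[^"]+)".*ContainerDied'),
    ("unmount", r'UnmountVolume started.*\(UID: \\"(?P<key>[^"\\]+)\\"'),
    ("removed", r'"SyncLoop REMOVE".*pods=\["(?P<key>[^"]+)"\]'),
    ("cleaned", r'"Cleaned up orphaned pod volumes dir".*podUID="(?P<key>[^"]+)"'),
]

def teardown_timeline(path: str):
    """Print (timestamp, milestone, pod-or-uid) for every teardown step found."""
    with open(path) as f:
        for line in f:
            for name, pattern in MILESTONES:
                m = re.search(pattern, line)
                if m:
                    ts = " ".join(line.split()[:3])   # e.g. 'Jan 20 11:05:58'
                    print(f"{ts}  {name:<8} {m['key']}")
                    break

if __name__ == "__main__":
    teardown_timeline("kubelet.log")
```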
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.614891 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4148776c-da93-4f3a-b552-fd5ea25d572b" containerName="registry-server" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.614919 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="4148776c-da93-4f3a-b552-fd5ea25d572b" containerName="registry-server" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.614943 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="454f3576-0963-4929-ac48-4651f534a99c" containerName="registry-server" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.614958 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="454f3576-0963-4929-ac48-4651f534a99c" containerName="registry-server" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.614984 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4148776c-da93-4f3a-b552-fd5ea25d572b" containerName="extract-utilities" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.615001 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="4148776c-da93-4f3a-b552-fd5ea25d572b" containerName="extract-utilities" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.615024 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1" containerName="extract-utilities" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.615042 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1" containerName="extract-utilities" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.615096 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1" containerName="extract-content" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.615117 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1" containerName="extract-content" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.615138 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4148776c-da93-4f3a-b552-fd5ea25d572b" containerName="extract-content" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.615166 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="4148776c-da93-4f3a-b552-fd5ea25d572b" containerName="extract-content" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.615192 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b27913bc-b262-4eb2-aeee-02a7365a3770" containerName="registry-server" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.615209 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="b27913bc-b262-4eb2-aeee-02a7365a3770" containerName="registry-server" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.615228 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="454f3576-0963-4929-ac48-4651f534a99c" containerName="extract-content" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.615243 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="454f3576-0963-4929-ac48-4651f534a99c" containerName="extract-content" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.615266 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1" containerName="registry-server" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.615280 4961 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1" containerName="registry-server" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.615329 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b27913bc-b262-4eb2-aeee-02a7365a3770" containerName="extract-content" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.615345 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="b27913bc-b262-4eb2-aeee-02a7365a3770" containerName="extract-content" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.615371 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="454f3576-0963-4929-ac48-4651f534a99c" containerName="extract-utilities" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.615386 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="454f3576-0963-4929-ac48-4651f534a99c" containerName="extract-utilities" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.615412 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b27913bc-b262-4eb2-aeee-02a7365a3770" containerName="extract-utilities" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.615451 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="b27913bc-b262-4eb2-aeee-02a7365a3770" containerName="extract-utilities" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.615681 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="b27913bc-b262-4eb2-aeee-02a7365a3770" containerName="registry-server" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.615707 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="195d7e5d-6ee3-44d3-9e8f-e1a2177fe4e1" containerName="registry-server" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.615731 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="4148776c-da93-4f3a-b552-fd5ea25d572b" containerName="registry-server" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.615753 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="454f3576-0963-4929-ac48-4651f534a99c" containerName="registry-server" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.616590 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.618110 4961 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.618410 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://a5711126878eeb10633f1b78029436e832beb6b014e35e3ba021407ebed1422e" gracePeriod=15 Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.618459 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://69a5f11e3e92b2ac3b75c763acc64d422f2a5877882e99b4ed0c48b03f41868b" gracePeriod=15 Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.618516 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://32cd6ed8ff62bd299b9410a413c0662071a02537a85fb2d15d1b1a8083abe404" gracePeriod=15 Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.618589 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ef3a93f93008fcfb5b46f10e01ea85bd3bdc5a99e3683498ac34f4fe3466e2d0" gracePeriod=15 Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.618522 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://482ece6a2d496362ab7bf45d1d70e97dfb70711b4094a08c3acf411e2bb2bd5e" gracePeriod=15 Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.620032 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.620120 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.620150 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.620313 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.620386 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.621335 4961 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.621570 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.621585 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.621604 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.621612 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.621622 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.621629 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.621637 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.621646 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.621656 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.621663 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.621674 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.621682 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.621690 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 20 11:06:26 crc 
kubenswrapper[4961]: I0120 11:06:26.621697 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.621803 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.621815 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.621824 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.621836 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.621850 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.621859 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 20 11:06:26 crc kubenswrapper[4961]: E0120 11:06:26.668820 4961 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.241:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.722398 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.722544 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.722606 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.722641 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.722667 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.722731 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.722736 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.722835 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.722873 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.722894 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.722925 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.723038 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.723132 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.824149 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.824212 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.824297 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.824371 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.824449 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.824493 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:26 crc kubenswrapper[4961]: I0120 11:06:26.972266 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:27 crc kubenswrapper[4961]: E0120 11:06:27.009129 4961 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.241:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188c6bb6ea49a5d4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-20 11:06:27.008677332 +0000 UTC m=+139.793177223,LastTimestamp:2026-01-20 11:06:27.008677332 +0000 UTC m=+139.793177223,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 20 11:06:27 crc kubenswrapper[4961]: I0120 11:06:27.478689 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"82e497f5506cdf464204b7e7e630965aab4b30b06b6f0c608313177efa4f300e"} Jan 20 11:06:27 crc kubenswrapper[4961]: I0120 11:06:27.479013 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"cc17d72b21e38f9869bc5469a0054ff3fdde9beaf9cebb5dca845b5420948c63"} Jan 20 11:06:27 crc kubenswrapper[4961]: E0120 11:06:27.479938 4961 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.241:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:06:27 crc kubenswrapper[4961]: I0120 11:06:27.480407 4961 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:27 crc kubenswrapper[4961]: I0120 11:06:27.481834 4961 generic.go:334] "Generic (PLEG): container finished" podID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" containerID="917ea4298748927ebf6fe44c3c6724e94b2fff418c36f9ac5da48641087bff4e" exitCode=0 Jan 20 11:06:27 crc kubenswrapper[4961]: I0120 11:06:27.481930 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7ffc62a1-dc68-4a34-8ebf-11662a07f343","Type":"ContainerDied","Data":"917ea4298748927ebf6fe44c3c6724e94b2fff418c36f9ac5da48641087bff4e"} Jan 20 11:06:27 crc kubenswrapper[4961]: I0120 11:06:27.482555 4961 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:27 crc kubenswrapper[4961]: I0120 11:06:27.483036 4961 status_manager.go:851] "Failed to get status for pod" podUID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:27 crc kubenswrapper[4961]: I0120 11:06:27.485684 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 20 11:06:27 crc kubenswrapper[4961]: I0120 11:06:27.488006 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 20 11:06:27 crc kubenswrapper[4961]: I0120 11:06:27.489209 4961 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="69a5f11e3e92b2ac3b75c763acc64d422f2a5877882e99b4ed0c48b03f41868b" exitCode=0 Jan 20 11:06:27 crc kubenswrapper[4961]: I0120 11:06:27.489252 4961 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ef3a93f93008fcfb5b46f10e01ea85bd3bdc5a99e3683498ac34f4fe3466e2d0" exitCode=0 Jan 20 11:06:27 crc kubenswrapper[4961]: I0120 11:06:27.489271 4961 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="32cd6ed8ff62bd299b9410a413c0662071a02537a85fb2d15d1b1a8083abe404" exitCode=0 Jan 20 11:06:27 crc kubenswrapper[4961]: I0120 11:06:27.489288 4961 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="482ece6a2d496362ab7bf45d1d70e97dfb70711b4094a08c3acf411e2bb2bd5e" exitCode=2 Jan 20 11:06:27 crc kubenswrapper[4961]: I0120 11:06:27.489692 4961 scope.go:117] "RemoveContainer" containerID="5ebc519f6d0ad45a8fb85392da0ecb11cbf59c38b9cb59933bc6cc18426f7b5f" Jan 20 11:06:27 crc kubenswrapper[4961]: I0120 11:06:27.543887 4961 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:27 crc kubenswrapper[4961]: I0120 11:06:27.544658 4961 status_manager.go:851] "Failed to get status for pod" podUID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:28 crc kubenswrapper[4961]: I0120 11:06:28.505892 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 20 11:06:28 crc kubenswrapper[4961]: I0120 11:06:28.890345 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 20 11:06:28 crc kubenswrapper[4961]: I0120 11:06:28.891745 4961 status_manager.go:851] "Failed to get status for pod" podUID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.020762 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.021715 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.022380 4961 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.022781 4961 status_manager.go:851] "Failed to get status for pod" podUID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.054367 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7ffc62a1-dc68-4a34-8ebf-11662a07f343-var-lock\") pod \"7ffc62a1-dc68-4a34-8ebf-11662a07f343\" (UID: \"7ffc62a1-dc68-4a34-8ebf-11662a07f343\") " Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.054423 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ffc62a1-dc68-4a34-8ebf-11662a07f343-kubelet-dir\") pod \"7ffc62a1-dc68-4a34-8ebf-11662a07f343\" (UID: \"7ffc62a1-dc68-4a34-8ebf-11662a07f343\") " Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.054466 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ffc62a1-dc68-4a34-8ebf-11662a07f343-kube-api-access\") pod \"7ffc62a1-dc68-4a34-8ebf-11662a07f343\" (UID: \"7ffc62a1-dc68-4a34-8ebf-11662a07f343\") " Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.054545 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ffc62a1-dc68-4a34-8ebf-11662a07f343-var-lock" (OuterVolumeSpecName: "var-lock") pod "7ffc62a1-dc68-4a34-8ebf-11662a07f343" (UID: "7ffc62a1-dc68-4a34-8ebf-11662a07f343"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.054569 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ffc62a1-dc68-4a34-8ebf-11662a07f343-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7ffc62a1-dc68-4a34-8ebf-11662a07f343" (UID: "7ffc62a1-dc68-4a34-8ebf-11662a07f343"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.054752 4961 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7ffc62a1-dc68-4a34-8ebf-11662a07f343-var-lock\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.054773 4961 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ffc62a1-dc68-4a34-8ebf-11662a07f343-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.060372 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ffc62a1-dc68-4a34-8ebf-11662a07f343-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7ffc62a1-dc68-4a34-8ebf-11662a07f343" (UID: "7ffc62a1-dc68-4a34-8ebf-11662a07f343"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.155316 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.155400 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.155482 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.155480 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.155551 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.155707 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.155863 4961 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.155892 4961 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.155918 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ffc62a1-dc68-4a34-8ebf-11662a07f343-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.155941 4961 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.514976 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7ffc62a1-dc68-4a34-8ebf-11662a07f343","Type":"ContainerDied","Data":"18b062c1bb98897f52af0fce5a54cf678b64df72a6a11b51d1f49529661176f1"} Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.515335 4961 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18b062c1bb98897f52af0fce5a54cf678b64df72a6a11b51d1f49529661176f1" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.515016 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.517771 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.518783 4961 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a5711126878eeb10633f1b78029436e832beb6b014e35e3ba021407ebed1422e" exitCode=0 Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.518848 4961 scope.go:117] "RemoveContainer" containerID="69a5f11e3e92b2ac3b75c763acc64d422f2a5877882e99b4ed0c48b03f41868b" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.518897 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.531390 4961 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.531789 4961 status_manager.go:851] "Failed to get status for pod" podUID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.538396 4961 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.538767 4961 status_manager.go:851] "Failed to get status for pod" podUID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.544704 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.545464 4961 scope.go:117] "RemoveContainer" containerID="ef3a93f93008fcfb5b46f10e01ea85bd3bdc5a99e3683498ac34f4fe3466e2d0" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.558318 4961 scope.go:117] "RemoveContainer" containerID="32cd6ed8ff62bd299b9410a413c0662071a02537a85fb2d15d1b1a8083abe404" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.574524 4961 scope.go:117] "RemoveContainer" containerID="482ece6a2d496362ab7bf45d1d70e97dfb70711b4094a08c3acf411e2bb2bd5e" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.595941 4961 scope.go:117] "RemoveContainer" containerID="a5711126878eeb10633f1b78029436e832beb6b014e35e3ba021407ebed1422e" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.614383 4961 scope.go:117] "RemoveContainer" containerID="dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.631667 4961 scope.go:117] "RemoveContainer" containerID="69a5f11e3e92b2ac3b75c763acc64d422f2a5877882e99b4ed0c48b03f41868b" Jan 20 11:06:29 crc kubenswrapper[4961]: E0120 11:06:29.632376 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69a5f11e3e92b2ac3b75c763acc64d422f2a5877882e99b4ed0c48b03f41868b\": container with ID starting with 69a5f11e3e92b2ac3b75c763acc64d422f2a5877882e99b4ed0c48b03f41868b not found: ID does not exist" containerID="69a5f11e3e92b2ac3b75c763acc64d422f2a5877882e99b4ed0c48b03f41868b" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.632426 4961 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"69a5f11e3e92b2ac3b75c763acc64d422f2a5877882e99b4ed0c48b03f41868b"} err="failed to get container status \"69a5f11e3e92b2ac3b75c763acc64d422f2a5877882e99b4ed0c48b03f41868b\": rpc error: code = NotFound desc = could not find container \"69a5f11e3e92b2ac3b75c763acc64d422f2a5877882e99b4ed0c48b03f41868b\": container with ID starting with 69a5f11e3e92b2ac3b75c763acc64d422f2a5877882e99b4ed0c48b03f41868b not found: ID does not exist" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.632456 4961 scope.go:117] "RemoveContainer" containerID="ef3a93f93008fcfb5b46f10e01ea85bd3bdc5a99e3683498ac34f4fe3466e2d0" Jan 20 11:06:29 crc kubenswrapper[4961]: E0120 11:06:29.633253 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef3a93f93008fcfb5b46f10e01ea85bd3bdc5a99e3683498ac34f4fe3466e2d0\": container with ID starting with ef3a93f93008fcfb5b46f10e01ea85bd3bdc5a99e3683498ac34f4fe3466e2d0 not found: ID does not exist" containerID="ef3a93f93008fcfb5b46f10e01ea85bd3bdc5a99e3683498ac34f4fe3466e2d0" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.633273 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef3a93f93008fcfb5b46f10e01ea85bd3bdc5a99e3683498ac34f4fe3466e2d0"} err="failed to get container status \"ef3a93f93008fcfb5b46f10e01ea85bd3bdc5a99e3683498ac34f4fe3466e2d0\": rpc error: code = NotFound desc = could not find container \"ef3a93f93008fcfb5b46f10e01ea85bd3bdc5a99e3683498ac34f4fe3466e2d0\": container with ID starting with ef3a93f93008fcfb5b46f10e01ea85bd3bdc5a99e3683498ac34f4fe3466e2d0 not found: ID does not exist" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.633286 4961 scope.go:117] "RemoveContainer" containerID="32cd6ed8ff62bd299b9410a413c0662071a02537a85fb2d15d1b1a8083abe404" Jan 20 11:06:29 crc kubenswrapper[4961]: E0120 11:06:29.634382 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32cd6ed8ff62bd299b9410a413c0662071a02537a85fb2d15d1b1a8083abe404\": container with ID starting with 32cd6ed8ff62bd299b9410a413c0662071a02537a85fb2d15d1b1a8083abe404 not found: ID does not exist" containerID="32cd6ed8ff62bd299b9410a413c0662071a02537a85fb2d15d1b1a8083abe404" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.634416 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32cd6ed8ff62bd299b9410a413c0662071a02537a85fb2d15d1b1a8083abe404"} err="failed to get container status \"32cd6ed8ff62bd299b9410a413c0662071a02537a85fb2d15d1b1a8083abe404\": rpc error: code = NotFound desc = could not find container \"32cd6ed8ff62bd299b9410a413c0662071a02537a85fb2d15d1b1a8083abe404\": container with ID starting with 32cd6ed8ff62bd299b9410a413c0662071a02537a85fb2d15d1b1a8083abe404 not found: ID does not exist" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.634438 4961 scope.go:117] "RemoveContainer" containerID="482ece6a2d496362ab7bf45d1d70e97dfb70711b4094a08c3acf411e2bb2bd5e" Jan 20 11:06:29 crc kubenswrapper[4961]: E0120 11:06:29.634739 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"482ece6a2d496362ab7bf45d1d70e97dfb70711b4094a08c3acf411e2bb2bd5e\": container with ID starting with 482ece6a2d496362ab7bf45d1d70e97dfb70711b4094a08c3acf411e2bb2bd5e not found: ID does not exist" 
containerID="482ece6a2d496362ab7bf45d1d70e97dfb70711b4094a08c3acf411e2bb2bd5e" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.634762 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"482ece6a2d496362ab7bf45d1d70e97dfb70711b4094a08c3acf411e2bb2bd5e"} err="failed to get container status \"482ece6a2d496362ab7bf45d1d70e97dfb70711b4094a08c3acf411e2bb2bd5e\": rpc error: code = NotFound desc = could not find container \"482ece6a2d496362ab7bf45d1d70e97dfb70711b4094a08c3acf411e2bb2bd5e\": container with ID starting with 482ece6a2d496362ab7bf45d1d70e97dfb70711b4094a08c3acf411e2bb2bd5e not found: ID does not exist" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.634774 4961 scope.go:117] "RemoveContainer" containerID="a5711126878eeb10633f1b78029436e832beb6b014e35e3ba021407ebed1422e" Jan 20 11:06:29 crc kubenswrapper[4961]: E0120 11:06:29.636540 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5711126878eeb10633f1b78029436e832beb6b014e35e3ba021407ebed1422e\": container with ID starting with a5711126878eeb10633f1b78029436e832beb6b014e35e3ba021407ebed1422e not found: ID does not exist" containerID="a5711126878eeb10633f1b78029436e832beb6b014e35e3ba021407ebed1422e" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.636569 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5711126878eeb10633f1b78029436e832beb6b014e35e3ba021407ebed1422e"} err="failed to get container status \"a5711126878eeb10633f1b78029436e832beb6b014e35e3ba021407ebed1422e\": rpc error: code = NotFound desc = could not find container \"a5711126878eeb10633f1b78029436e832beb6b014e35e3ba021407ebed1422e\": container with ID starting with a5711126878eeb10633f1b78029436e832beb6b014e35e3ba021407ebed1422e not found: ID does not exist" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.636586 4961 scope.go:117] "RemoveContainer" containerID="dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f" Jan 20 11:06:29 crc kubenswrapper[4961]: E0120 11:06:29.636822 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f\": container with ID starting with dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f not found: ID does not exist" containerID="dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f" Jan 20 11:06:29 crc kubenswrapper[4961]: I0120 11:06:29.636849 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f"} err="failed to get container status \"dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f\": rpc error: code = NotFound desc = could not find container \"dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f\": container with ID starting with dc794a2645a439cde874f878e0d95a3c644dd9c2556535174fcd9a44e418c48f not found: ID does not exist" Jan 20 11:06:30 crc kubenswrapper[4961]: E0120 11:06:30.407580 4961 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:30 crc kubenswrapper[4961]: E0120 11:06:30.408332 4961 controller.go:195] "Failed to update lease" 
err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:30 crc kubenswrapper[4961]: E0120 11:06:30.408626 4961 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:30 crc kubenswrapper[4961]: E0120 11:06:30.409012 4961 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:30 crc kubenswrapper[4961]: E0120 11:06:30.409840 4961 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:30 crc kubenswrapper[4961]: I0120 11:06:30.409871 4961 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 20 11:06:30 crc kubenswrapper[4961]: E0120 11:06:30.410029 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="200ms" Jan 20 11:06:30 crc kubenswrapper[4961]: E0120 11:06:30.610810 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="400ms" Jan 20 11:06:31 crc kubenswrapper[4961]: E0120 11:06:31.013398 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="800ms" Jan 20 11:06:31 crc kubenswrapper[4961]: I0120 11:06:31.723097 4961 patch_prober.go:28] interesting pod/machine-config-daemon-48nk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 11:06:31 crc kubenswrapper[4961]: I0120 11:06:31.723431 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" podUID="8a5754ab-8fe3-41b8-b760-b3d154e89ba8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 11:06:31 crc kubenswrapper[4961]: E0120 11:06:31.814399 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="1.6s" Jan 20 11:06:33 crc kubenswrapper[4961]: E0120 11:06:33.415860 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="3.2s" Jan 20 11:06:33 crc kubenswrapper[4961]: E0120 11:06:33.953227 4961 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.241:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188c6bb6ea49a5d4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-20 11:06:27.008677332 +0000 UTC m=+139.793177223,LastTimestamp:2026-01-20 11:06:27.008677332 +0000 UTC m=+139.793177223,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 20 11:06:35 crc kubenswrapper[4961]: I0120 11:06:35.597879 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" podUID="82cc64f1-0377-43e9-94a0-213d82b4a415" containerName="oauth-openshift" containerID="cri-o://9ab61abb23c6e58eafa0f37b907333cb677d1d66aa2e352ae9ad35bad49af9bf" gracePeriod=15 Jan 20 11:06:35 crc kubenswrapper[4961]: I0120 11:06:35.949936 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:06:35 crc kubenswrapper[4961]: I0120 11:06:35.950649 4961 status_manager.go:851] "Failed to get status for pod" podUID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:35 crc kubenswrapper[4961]: I0120 11:06:35.950822 4961 status_manager.go:851] "Failed to get status for pod" podUID="82cc64f1-0377-43e9-94a0-213d82b4a415" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-jjg29\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.146595 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-serving-cert\") pod \"82cc64f1-0377-43e9-94a0-213d82b4a415\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.146642 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82cc64f1-0377-43e9-94a0-213d82b4a415-audit-dir\") pod \"82cc64f1-0377-43e9-94a0-213d82b4a415\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.146674 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-session\") pod \"82cc64f1-0377-43e9-94a0-213d82b4a415\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.146707 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-cliconfig\") pod \"82cc64f1-0377-43e9-94a0-213d82b4a415\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.146728 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-trusted-ca-bundle\") pod \"82cc64f1-0377-43e9-94a0-213d82b4a415\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.146757 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-ocp-branding-template\") pod \"82cc64f1-0377-43e9-94a0-213d82b4a415\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.146803 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rxrm\" (UniqueName: \"kubernetes.io/projected/82cc64f1-0377-43e9-94a0-213d82b4a415-kube-api-access-7rxrm\") pod \"82cc64f1-0377-43e9-94a0-213d82b4a415\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " Jan 20 
11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.146831 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-audit-policies\") pod \"82cc64f1-0377-43e9-94a0-213d82b4a415\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.146869 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-provider-selection\") pod \"82cc64f1-0377-43e9-94a0-213d82b4a415\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.146898 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-login\") pod \"82cc64f1-0377-43e9-94a0-213d82b4a415\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.146942 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-error\") pod \"82cc64f1-0377-43e9-94a0-213d82b4a415\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.146986 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-idp-0-file-data\") pod \"82cc64f1-0377-43e9-94a0-213d82b4a415\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.147014 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-router-certs\") pod \"82cc64f1-0377-43e9-94a0-213d82b4a415\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.147052 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-service-ca\") pod \"82cc64f1-0377-43e9-94a0-213d82b4a415\" (UID: \"82cc64f1-0377-43e9-94a0-213d82b4a415\") " Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.147342 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82cc64f1-0377-43e9-94a0-213d82b4a415-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "82cc64f1-0377-43e9-94a0-213d82b4a415" (UID: "82cc64f1-0377-43e9-94a0-213d82b4a415"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.148035 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "82cc64f1-0377-43e9-94a0-213d82b4a415" (UID: "82cc64f1-0377-43e9-94a0-213d82b4a415"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.148121 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "82cc64f1-0377-43e9-94a0-213d82b4a415" (UID: "82cc64f1-0377-43e9-94a0-213d82b4a415"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.148397 4961 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82cc64f1-0377-43e9-94a0-213d82b4a415-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.148595 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.148629 4961 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.148715 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "82cc64f1-0377-43e9-94a0-213d82b4a415" (UID: "82cc64f1-0377-43e9-94a0-213d82b4a415"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.149135 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "82cc64f1-0377-43e9-94a0-213d82b4a415" (UID: "82cc64f1-0377-43e9-94a0-213d82b4a415"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.152613 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "82cc64f1-0377-43e9-94a0-213d82b4a415" (UID: "82cc64f1-0377-43e9-94a0-213d82b4a415"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.153005 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "82cc64f1-0377-43e9-94a0-213d82b4a415" (UID: "82cc64f1-0377-43e9-94a0-213d82b4a415"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.153718 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "82cc64f1-0377-43e9-94a0-213d82b4a415" (UID: "82cc64f1-0377-43e9-94a0-213d82b4a415"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.154504 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "82cc64f1-0377-43e9-94a0-213d82b4a415" (UID: "82cc64f1-0377-43e9-94a0-213d82b4a415"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.154795 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "82cc64f1-0377-43e9-94a0-213d82b4a415" (UID: "82cc64f1-0377-43e9-94a0-213d82b4a415"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.154865 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "82cc64f1-0377-43e9-94a0-213d82b4a415" (UID: "82cc64f1-0377-43e9-94a0-213d82b4a415"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.159264 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "82cc64f1-0377-43e9-94a0-213d82b4a415" (UID: "82cc64f1-0377-43e9-94a0-213d82b4a415"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.160834 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82cc64f1-0377-43e9-94a0-213d82b4a415-kube-api-access-7rxrm" (OuterVolumeSpecName: "kube-api-access-7rxrm") pod "82cc64f1-0377-43e9-94a0-213d82b4a415" (UID: "82cc64f1-0377-43e9-94a0-213d82b4a415"). InnerVolumeSpecName "kube-api-access-7rxrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.176383 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "82cc64f1-0377-43e9-94a0-213d82b4a415" (UID: "82cc64f1-0377-43e9-94a0-213d82b4a415"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.249447 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.249492 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.249510 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.249526 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.249543 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.249557 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.249569 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.249584 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.249596 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rxrm\" (UniqueName: \"kubernetes.io/projected/82cc64f1-0377-43e9-94a0-213d82b4a415-kube-api-access-7rxrm\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.249608 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.249621 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/82cc64f1-0377-43e9-94a0-213d82b4a415-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 20 11:06:36 crc kubenswrapper[4961]: E0120 11:06:36.551987 4961 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC 
openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.241:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" volumeName="registry-storage" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.561011 4961 generic.go:334] "Generic (PLEG): container finished" podID="82cc64f1-0377-43e9-94a0-213d82b4a415" containerID="9ab61abb23c6e58eafa0f37b907333cb677d1d66aa2e352ae9ad35bad49af9bf" exitCode=0 Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.561088 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" event={"ID":"82cc64f1-0377-43e9-94a0-213d82b4a415","Type":"ContainerDied","Data":"9ab61abb23c6e58eafa0f37b907333cb677d1d66aa2e352ae9ad35bad49af9bf"} Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.561136 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" event={"ID":"82cc64f1-0377-43e9-94a0-213d82b4a415","Type":"ContainerDied","Data":"63396020482cc7e372d5186d1bc2d5a75a304d516d3ba361c67111c453367d65"} Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.561157 4961 scope.go:117] "RemoveContainer" containerID="9ab61abb23c6e58eafa0f37b907333cb677d1d66aa2e352ae9ad35bad49af9bf" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.561225 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.563466 4961 status_manager.go:851] "Failed to get status for pod" podUID="82cc64f1-0377-43e9-94a0-213d82b4a415" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-jjg29\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.564127 4961 status_manager.go:851] "Failed to get status for pod" podUID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.581132 4961 scope.go:117] "RemoveContainer" containerID="9ab61abb23c6e58eafa0f37b907333cb677d1d66aa2e352ae9ad35bad49af9bf" Jan 20 11:06:36 crc kubenswrapper[4961]: E0120 11:06:36.581671 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ab61abb23c6e58eafa0f37b907333cb677d1d66aa2e352ae9ad35bad49af9bf\": container with ID starting with 9ab61abb23c6e58eafa0f37b907333cb677d1d66aa2e352ae9ad35bad49af9bf not found: ID does not exist" containerID="9ab61abb23c6e58eafa0f37b907333cb677d1d66aa2e352ae9ad35bad49af9bf" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.581709 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab61abb23c6e58eafa0f37b907333cb677d1d66aa2e352ae9ad35bad49af9bf"} err="failed to get container status \"9ab61abb23c6e58eafa0f37b907333cb677d1d66aa2e352ae9ad35bad49af9bf\": rpc error: code = NotFound desc = could not find container 
\"9ab61abb23c6e58eafa0f37b907333cb677d1d66aa2e352ae9ad35bad49af9bf\": container with ID starting with 9ab61abb23c6e58eafa0f37b907333cb677d1d66aa2e352ae9ad35bad49af9bf not found: ID does not exist" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.592115 4961 status_manager.go:851] "Failed to get status for pod" podUID="82cc64f1-0377-43e9-94a0-213d82b4a415" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-jjg29\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:36 crc kubenswrapper[4961]: I0120 11:06:36.592607 4961 status_manager.go:851] "Failed to get status for pod" podUID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:36 crc kubenswrapper[4961]: E0120 11:06:36.617270 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="6.4s" Jan 20 11:06:37 crc kubenswrapper[4961]: I0120 11:06:37.541413 4961 status_manager.go:851] "Failed to get status for pod" podUID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:37 crc kubenswrapper[4961]: I0120 11:06:37.542095 4961 status_manager.go:851] "Failed to get status for pod" podUID="82cc64f1-0377-43e9-94a0-213d82b4a415" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-jjg29\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:40 crc kubenswrapper[4961]: I0120 11:06:40.585204 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 20 11:06:40 crc kubenswrapper[4961]: I0120 11:06:40.585475 4961 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="315a88354b4b31aef2ff2afed632ace0acfd5b7bfd5f822722f82ad407c88bce" exitCode=1 Jan 20 11:06:40 crc kubenswrapper[4961]: I0120 11:06:40.585503 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"315a88354b4b31aef2ff2afed632ace0acfd5b7bfd5f822722f82ad407c88bce"} Jan 20 11:06:40 crc kubenswrapper[4961]: I0120 11:06:40.585977 4961 scope.go:117] "RemoveContainer" containerID="315a88354b4b31aef2ff2afed632ace0acfd5b7bfd5f822722f82ad407c88bce" Jan 20 11:06:40 crc kubenswrapper[4961]: I0120 11:06:40.586855 4961 status_manager.go:851] "Failed to get status for pod" podUID="82cc64f1-0377-43e9-94a0-213d82b4a415" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-jjg29\": dial tcp 38.102.83.241:6443: connect: connection 
refused" Jan 20 11:06:40 crc kubenswrapper[4961]: I0120 11:06:40.587331 4961 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:40 crc kubenswrapper[4961]: I0120 11:06:40.587786 4961 status_manager.go:851] "Failed to get status for pod" podUID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:41 crc kubenswrapper[4961]: I0120 11:06:41.542513 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:41 crc kubenswrapper[4961]: I0120 11:06:41.544565 4961 status_manager.go:851] "Failed to get status for pod" podUID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:41 crc kubenswrapper[4961]: I0120 11:06:41.548018 4961 status_manager.go:851] "Failed to get status for pod" podUID="82cc64f1-0377-43e9-94a0-213d82b4a415" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-jjg29\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:41 crc kubenswrapper[4961]: I0120 11:06:41.550463 4961 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:41 crc kubenswrapper[4961]: I0120 11:06:41.558565 4961 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b8d950b7-2b32-443f-b6ea-67115df80c62" Jan 20 11:06:41 crc kubenswrapper[4961]: I0120 11:06:41.558597 4961 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b8d950b7-2b32-443f-b6ea-67115df80c62" Jan 20 11:06:41 crc kubenswrapper[4961]: E0120 11:06:41.559046 4961 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:41 crc kubenswrapper[4961]: I0120 11:06:41.559838 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:41 crc kubenswrapper[4961]: W0120 11:06:41.589045 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-fd27f75407879522b24b2b8ac7991564438a93ea2d141cf5767c3880c65c5638 WatchSource:0}: Error finding container fd27f75407879522b24b2b8ac7991564438a93ea2d141cf5767c3880c65c5638: Status 404 returned error can't find the container with id fd27f75407879522b24b2b8ac7991564438a93ea2d141cf5767c3880c65c5638 Jan 20 11:06:41 crc kubenswrapper[4961]: I0120 11:06:41.595268 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 20 11:06:41 crc kubenswrapper[4961]: I0120 11:06:41.595334 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"aa53a55846bf76c823337eced0cbef70e389a9dfab7d87ce78931fec695fa4e4"} Jan 20 11:06:41 crc kubenswrapper[4961]: I0120 11:06:41.596300 4961 status_manager.go:851] "Failed to get status for pod" podUID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:41 crc kubenswrapper[4961]: I0120 11:06:41.596783 4961 status_manager.go:851] "Failed to get status for pod" podUID="82cc64f1-0377-43e9-94a0-213d82b4a415" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-jjg29\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:41 crc kubenswrapper[4961]: I0120 11:06:41.597324 4961 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:42 crc kubenswrapper[4961]: I0120 11:06:42.605504 4961 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="5ea62eeea8ee1273cd8455f002381958b10ba84389e913937e8ce0067773bcbe" exitCode=0 Jan 20 11:06:42 crc kubenswrapper[4961]: I0120 11:06:42.605637 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"5ea62eeea8ee1273cd8455f002381958b10ba84389e913937e8ce0067773bcbe"} Jan 20 11:06:42 crc kubenswrapper[4961]: I0120 11:06:42.605770 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fd27f75407879522b24b2b8ac7991564438a93ea2d141cf5767c3880c65c5638"} Jan 20 11:06:42 crc kubenswrapper[4961]: I0120 11:06:42.605986 4961 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b8d950b7-2b32-443f-b6ea-67115df80c62" Jan 20 11:06:42 crc kubenswrapper[4961]: I0120 
11:06:42.606000 4961 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b8d950b7-2b32-443f-b6ea-67115df80c62" Jan 20 11:06:42 crc kubenswrapper[4961]: E0120 11:06:42.606685 4961 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:42 crc kubenswrapper[4961]: I0120 11:06:42.606923 4961 status_manager.go:851] "Failed to get status for pod" podUID="82cc64f1-0377-43e9-94a0-213d82b4a415" pod="openshift-authentication/oauth-openshift-558db77b4-jjg29" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-jjg29\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:42 crc kubenswrapper[4961]: I0120 11:06:42.607485 4961 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:42 crc kubenswrapper[4961]: I0120 11:06:42.608032 4961 status_manager.go:851] "Failed to get status for pod" podUID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.241:6443: connect: connection refused" Jan 20 11:06:43 crc kubenswrapper[4961]: E0120 11:06:43.019109 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.241:6443: connect: connection refused" interval="7s" Jan 20 11:06:43 crc kubenswrapper[4961]: I0120 11:06:43.554754 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:06:43 crc kubenswrapper[4961]: I0120 11:06:43.555198 4961 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 20 11:06:43 crc kubenswrapper[4961]: I0120 11:06:43.555253 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 20 11:06:43 crc kubenswrapper[4961]: I0120 11:06:43.613934 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bee5b2a6c31e113cc87337789b8b439f402529e6c1820282b02a226743f3d5bb"} Jan 20 11:06:43 crc kubenswrapper[4961]: I0120 11:06:43.613982 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"89431931cdc5073659b5edb74186880636e7d9595494a522de74f263b47f7deb"} Jan 20 11:06:43 crc kubenswrapper[4961]: I0120 11:06:43.613999 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6ace818bc0aa333182fbf50b4745b441aa3a5c1fb48a21a83641710e1eb725fb"} Jan 20 11:06:44 crc kubenswrapper[4961]: I0120 11:06:44.621836 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"97d3e79d660e966da54aa10497fa4bd05024444ca966f3b73e2a9ebac0e69934"} Jan 20 11:06:44 crc kubenswrapper[4961]: I0120 11:06:44.622217 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b63d094d107e6249544f1892ec29efbf0c86a9f78166febb088af0156c30c576"} Jan 20 11:06:44 crc kubenswrapper[4961]: I0120 11:06:44.622238 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:44 crc kubenswrapper[4961]: I0120 11:06:44.622138 4961 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b8d950b7-2b32-443f-b6ea-67115df80c62" Jan 20 11:06:44 crc kubenswrapper[4961]: I0120 11:06:44.622259 4961 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b8d950b7-2b32-443f-b6ea-67115df80c62" Jan 20 11:06:46 crc kubenswrapper[4961]: I0120 11:06:46.560082 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:46 crc kubenswrapper[4961]: I0120 11:06:46.560431 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:46 crc kubenswrapper[4961]: I0120 11:06:46.565494 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:48 crc kubenswrapper[4961]: I0120 11:06:48.533309 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:06:49 crc kubenswrapper[4961]: I0120 11:06:49.633205 4961 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:49 crc kubenswrapper[4961]: I0120 11:06:49.655169 4961 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b8d950b7-2b32-443f-b6ea-67115df80c62" Jan 20 11:06:49 crc kubenswrapper[4961]: I0120 11:06:49.655197 4961 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b8d950b7-2b32-443f-b6ea-67115df80c62" Jan 20 11:06:49 crc kubenswrapper[4961]: I0120 11:06:49.660171 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:06:49 crc kubenswrapper[4961]: I0120 11:06:49.757904 4961 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ba4f9e2f-37c9-4719-8e34-6053a8b2980b" Jan 20 11:06:50 crc 
kubenswrapper[4961]: I0120 11:06:50.661709 4961 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b8d950b7-2b32-443f-b6ea-67115df80c62" Jan 20 11:06:50 crc kubenswrapper[4961]: I0120 11:06:50.662252 4961 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b8d950b7-2b32-443f-b6ea-67115df80c62" Jan 20 11:06:50 crc kubenswrapper[4961]: I0120 11:06:50.664941 4961 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ba4f9e2f-37c9-4719-8e34-6053a8b2980b" Jan 20 11:06:53 crc kubenswrapper[4961]: I0120 11:06:53.554214 4961 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 20 11:06:53 crc kubenswrapper[4961]: I0120 11:06:53.554573 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 20 11:06:59 crc kubenswrapper[4961]: I0120 11:06:59.010419 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 20 11:06:59 crc kubenswrapper[4961]: I0120 11:06:59.020514 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 20 11:06:59 crc kubenswrapper[4961]: I0120 11:06:59.248560 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 20 11:06:59 crc kubenswrapper[4961]: I0120 11:06:59.373604 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 20 11:06:59 crc kubenswrapper[4961]: I0120 11:06:59.678864 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 20 11:06:59 crc kubenswrapper[4961]: I0120 11:06:59.899562 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 20 11:07:00 crc kubenswrapper[4961]: I0120 11:07:00.119804 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 20 11:07:00 crc kubenswrapper[4961]: I0120 11:07:00.294436 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 20 11:07:00 crc kubenswrapper[4961]: I0120 11:07:00.538332 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 20 11:07:00 crc kubenswrapper[4961]: I0120 11:07:00.547563 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 20 11:07:00 crc kubenswrapper[4961]: I0120 11:07:00.903757 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 20 11:07:00 crc 
kubenswrapper[4961]: I0120 11:07:00.918317 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 20 11:07:01 crc kubenswrapper[4961]: I0120 11:07:01.090689 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 20 11:07:01 crc kubenswrapper[4961]: I0120 11:07:01.093745 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 20 11:07:01 crc kubenswrapper[4961]: I0120 11:07:01.175341 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 20 11:07:01 crc kubenswrapper[4961]: I0120 11:07:01.175409 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 20 11:07:01 crc kubenswrapper[4961]: I0120 11:07:01.395735 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 20 11:07:01 crc kubenswrapper[4961]: I0120 11:07:01.420864 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 20 11:07:01 crc kubenswrapper[4961]: I0120 11:07:01.424895 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 20 11:07:01 crc kubenswrapper[4961]: I0120 11:07:01.723397 4961 patch_prober.go:28] interesting pod/machine-config-daemon-48nk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 11:07:01 crc kubenswrapper[4961]: I0120 11:07:01.723497 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" podUID="8a5754ab-8fe3-41b8-b760-b3d154e89ba8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 11:07:01 crc kubenswrapper[4961]: I0120 11:07:01.753626 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 20 11:07:02 crc kubenswrapper[4961]: I0120 11:07:02.026404 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 20 11:07:02 crc kubenswrapper[4961]: I0120 11:07:02.047769 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 20 11:07:02 crc kubenswrapper[4961]: I0120 11:07:02.072142 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 20 11:07:02 crc kubenswrapper[4961]: I0120 11:07:02.118911 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 20 11:07:02 crc kubenswrapper[4961]: I0120 11:07:02.120043 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 20 11:07:02 crc kubenswrapper[4961]: I0120 11:07:02.482015 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 20 11:07:02 crc 
kubenswrapper[4961]: I0120 11:07:02.485187 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 20 11:07:02 crc kubenswrapper[4961]: I0120 11:07:02.505179 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 20 11:07:02 crc kubenswrapper[4961]: I0120 11:07:02.557604 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 20 11:07:02 crc kubenswrapper[4961]: I0120 11:07:02.781936 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 20 11:07:02 crc kubenswrapper[4961]: I0120 11:07:02.869484 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.012143 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.339532 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.340408 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.341502 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.341891 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.344381 4961 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.346792 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.347674 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.350780 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.352300 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.363903 4961 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.368007 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jjg29","openshift-kube-apiserver/kube-apiserver-crc"] Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.368084 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.376003 4961 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.404502 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.416736 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.416717997 podStartE2EDuration="14.416717997s" podCreationTimestamp="2026-01-20 11:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:07:03.415350301 +0000 UTC m=+176.199850172" watchObservedRunningTime="2026-01-20 11:07:03.416717997 +0000 UTC m=+176.201217868" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.454018 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.543534 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.546001 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82cc64f1-0377-43e9-94a0-213d82b4a415" path="/var/lib/kubelet/pods/82cc64f1-0377-43e9-94a0-213d82b4a415/volumes" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.554034 4961 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.554123 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.554177 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.554699 4961 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"aa53a55846bf76c823337eced0cbef70e389a9dfab7d87ce78931fec695fa4e4"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.554841 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://aa53a55846bf76c823337eced0cbef70e389a9dfab7d87ce78931fec695fa4e4" gracePeriod=30 Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.611826 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.632310 4961 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.694731 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.762585 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.837636 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.872545 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.980629 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 20 11:07:03 crc kubenswrapper[4961]: I0120 11:07:03.996136 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 20 11:07:04 crc kubenswrapper[4961]: I0120 11:07:04.081876 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 20 11:07:04 crc kubenswrapper[4961]: I0120 11:07:04.122134 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 20 11:07:04 crc kubenswrapper[4961]: I0120 11:07:04.191385 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 20 11:07:04 crc kubenswrapper[4961]: I0120 11:07:04.219026 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 20 11:07:04 crc kubenswrapper[4961]: I0120 11:07:04.309896 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 20 11:07:04 crc kubenswrapper[4961]: I0120 11:07:04.344114 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 20 11:07:04 crc kubenswrapper[4961]: I0120 11:07:04.422014 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 20 11:07:04 crc kubenswrapper[4961]: I0120 11:07:04.668635 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 20 11:07:04 crc kubenswrapper[4961]: I0120 11:07:04.686965 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 20 11:07:04 crc kubenswrapper[4961]: I0120 11:07:04.716227 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 20 11:07:04 crc kubenswrapper[4961]: I0120 11:07:04.726503 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 20 11:07:04 crc kubenswrapper[4961]: I0120 11:07:04.810182 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 20 11:07:04 crc kubenswrapper[4961]: I0120 11:07:04.930817 4961 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 20 11:07:04 crc kubenswrapper[4961]: I0120 11:07:04.996874 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.045641 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.091082 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.173685 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.175973 4961 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.202273 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.240287 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.313755 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.340918 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.452374 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.454998 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.530686 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.541055 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.545103 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.670272 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.673538 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.687676 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.701468 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.720118 4961 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.906714 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 20 11:07:05 crc kubenswrapper[4961]: I0120 11:07:05.959694 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 20 11:07:06 crc kubenswrapper[4961]: I0120 11:07:06.140201 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 20 11:07:06 crc kubenswrapper[4961]: I0120 11:07:06.206499 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 20 11:07:06 crc kubenswrapper[4961]: I0120 11:07:06.254895 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 20 11:07:06 crc kubenswrapper[4961]: I0120 11:07:06.405905 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 20 11:07:06 crc kubenswrapper[4961]: I0120 11:07:06.447775 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 20 11:07:06 crc kubenswrapper[4961]: I0120 11:07:06.463048 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 20 11:07:06 crc kubenswrapper[4961]: I0120 11:07:06.550490 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 20 11:07:06 crc kubenswrapper[4961]: I0120 11:07:06.684561 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 20 11:07:06 crc kubenswrapper[4961]: I0120 11:07:06.749863 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 20 11:07:06 crc kubenswrapper[4961]: I0120 11:07:06.772629 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 20 11:07:06 crc kubenswrapper[4961]: I0120 11:07:06.852199 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 20 11:07:06 crc kubenswrapper[4961]: I0120 11:07:06.872894 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 20 11:07:06 crc kubenswrapper[4961]: I0120 11:07:06.882397 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 20 11:07:06 crc kubenswrapper[4961]: I0120 11:07:06.883287 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 20 11:07:06 crc kubenswrapper[4961]: I0120 11:07:06.899353 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 20 11:07:06 crc kubenswrapper[4961]: I0120 11:07:06.970339 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.003925 4961 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.056351 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.226626 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.227755 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.250117 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.283585 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.302740 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.311627 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.347243 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.389230 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.392209 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.397402 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.411041 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.500503 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.685263 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.707826 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.716719 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.761147 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.789048 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.855757 4961 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.908123 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.934584 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 20 11:07:07 crc kubenswrapper[4961]: I0120 11:07:07.985805 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.042678 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.043734 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.067308 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.072357 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.140700 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.145860 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.162840 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.165118 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.254909 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.255671 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.291890 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.298929 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.346960 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.429110 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.468434 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.493422 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" 
Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.494190 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.573537 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.590034 4961 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.626894 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.666320 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.732347 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.774711 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.813938 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.868813 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.888339 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 20 11:07:08 crc kubenswrapper[4961]: I0120 11:07:08.957556 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 20 11:07:09 crc kubenswrapper[4961]: I0120 11:07:09.003476 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 20 11:07:09 crc kubenswrapper[4961]: I0120 11:07:09.065941 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 20 11:07:09 crc kubenswrapper[4961]: I0120 11:07:09.071913 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 20 11:07:09 crc kubenswrapper[4961]: I0120 11:07:09.104445 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 20 11:07:09 crc kubenswrapper[4961]: I0120 11:07:09.224419 4961 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 20 11:07:09 crc kubenswrapper[4961]: I0120 11:07:09.284520 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 20 11:07:09 crc kubenswrapper[4961]: I0120 11:07:09.306368 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 20 11:07:09 crc kubenswrapper[4961]: I0120 11:07:09.332839 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 20 
11:07:09 crc kubenswrapper[4961]: I0120 11:07:09.447992 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 20 11:07:09 crc kubenswrapper[4961]: I0120 11:07:09.566470 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 20 11:07:09 crc kubenswrapper[4961]: I0120 11:07:09.567356 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 20 11:07:09 crc kubenswrapper[4961]: I0120 11:07:09.674420 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 20 11:07:09 crc kubenswrapper[4961]: I0120 11:07:09.835027 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 20 11:07:09 crc kubenswrapper[4961]: I0120 11:07:09.914882 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 20 11:07:09 crc kubenswrapper[4961]: I0120 11:07:09.987322 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.061542 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.114620 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.167503 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.173470 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.243022 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.291571 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.333932 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.355981 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.359703 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.420207 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.486157 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.549169 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" 
Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.586539 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.671541 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.712012 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.801225 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.854924 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 20 11:07:10 crc kubenswrapper[4961]: I0120 11:07:10.978651 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.018361 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.044047 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.092241 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.134868 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.183283 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.211791 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.285320 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.294581 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.336402 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.416267 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.531767 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.534407 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.590600 4961 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.635468 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.682505 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5844cf768-j624v"] Jan 20 11:07:11 crc kubenswrapper[4961]: E0120 11:07:11.683464 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" containerName="installer" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.683497 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" containerName="installer" Jan 20 11:07:11 crc kubenswrapper[4961]: E0120 11:07:11.683509 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82cc64f1-0377-43e9-94a0-213d82b4a415" containerName="oauth-openshift" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.683518 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="82cc64f1-0377-43e9-94a0-213d82b4a415" containerName="oauth-openshift" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.683695 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ffc62a1-dc68-4a34-8ebf-11662a07f343" containerName="installer" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.683725 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="82cc64f1-0377-43e9-94a0-213d82b4a415" containerName="oauth-openshift" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.684405 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.686502 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.687146 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.687344 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.687473 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.687933 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.688097 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.688167 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.690799 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.691344 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 20 11:07:11 crc kubenswrapper[4961]: 
I0120 11:07:11.691514 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.691907 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.692186 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.696961 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5844cf768-j624v"] Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.701290 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.713390 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.727445 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.751729 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.751776 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24e048b6-45e1-41f0-ad2f-bef68af46750-audit-dir\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.751795 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-router-certs\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.751811 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-user-template-error\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.751919 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-user-template-login\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " 
pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.751987 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.752043 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.752103 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24e048b6-45e1-41f0-ad2f-bef68af46750-audit-policies\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.752157 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.752193 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.752250 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.752287 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-service-ca\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.752313 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-session\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.752352 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bck4z\" (UniqueName: \"kubernetes.io/projected/24e048b6-45e1-41f0-ad2f-bef68af46750-kube-api-access-bck4z\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.810960 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.819661 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.853472 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.853534 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24e048b6-45e1-41f0-ad2f-bef68af46750-audit-policies\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.853562 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.853589 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.853621 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.854337 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.854377 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-session\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.854403 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bck4z\" (UniqueName: \"kubernetes.io/projected/24e048b6-45e1-41f0-ad2f-bef68af46750-kube-api-access-bck4z\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.854441 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.854468 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24e048b6-45e1-41f0-ad2f-bef68af46750-audit-dir\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.854492 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-user-template-error\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.854519 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-router-certs\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.854554 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-user-template-login\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.854587 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5844cf768-j624v\" (UID: 
\"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.854843 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.855240 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24e048b6-45e1-41f0-ad2f-bef68af46750-audit-dir\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.855615 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.855648 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-service-ca\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.857930 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24e048b6-45e1-41f0-ad2f-bef68af46750-audit-policies\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.859196 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-user-template-error\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.859307 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-session\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.859761 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc 
kubenswrapper[4961]: I0120 11:07:11.859773 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.859959 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.860109 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.869777 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-user-template-login\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.872440 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bck4z\" (UniqueName: \"kubernetes.io/projected/24e048b6-45e1-41f0-ad2f-bef68af46750-kube-api-access-bck4z\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.872542 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/24e048b6-45e1-41f0-ad2f-bef68af46750-v4-0-config-system-router-certs\") pod \"oauth-openshift-5844cf768-j624v\" (UID: \"24e048b6-45e1-41f0-ad2f-bef68af46750\") " pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:11 crc kubenswrapper[4961]: I0120 11:07:11.976904 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 20 11:07:12 crc kubenswrapper[4961]: I0120 11:07:12.009804 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:12 crc kubenswrapper[4961]: I0120 11:07:12.013656 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 20 11:07:12 crc kubenswrapper[4961]: I0120 11:07:12.116867 4961 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 20 11:07:12 crc kubenswrapper[4961]: I0120 11:07:12.117451 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://82e497f5506cdf464204b7e7e630965aab4b30b06b6f0c608313177efa4f300e" gracePeriod=5 Jan 20 11:07:12 crc kubenswrapper[4961]: I0120 11:07:12.136318 4961 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 20 11:07:12 crc kubenswrapper[4961]: I0120 11:07:12.236584 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5844cf768-j624v"] Jan 20 11:07:12 crc kubenswrapper[4961]: W0120 11:07:12.247453 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24e048b6_45e1_41f0_ad2f_bef68af46750.slice/crio-600ccec6ed5a333dc85f73e89d3bb936e5076bee9ddadd35ede72f8dd304753e WatchSource:0}: Error finding container 600ccec6ed5a333dc85f73e89d3bb936e5076bee9ddadd35ede72f8dd304753e: Status 404 returned error can't find the container with id 600ccec6ed5a333dc85f73e89d3bb936e5076bee9ddadd35ede72f8dd304753e Jan 20 11:07:12 crc kubenswrapper[4961]: I0120 11:07:12.292687 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 20 11:07:12 crc kubenswrapper[4961]: I0120 11:07:12.371522 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 20 11:07:12 crc kubenswrapper[4961]: I0120 11:07:12.377372 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5844cf768-j624v" event={"ID":"24e048b6-45e1-41f0-ad2f-bef68af46750","Type":"ContainerStarted","Data":"600ccec6ed5a333dc85f73e89d3bb936e5076bee9ddadd35ede72f8dd304753e"} Jan 20 11:07:12 crc kubenswrapper[4961]: I0120 11:07:12.382381 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 20 11:07:12 crc kubenswrapper[4961]: I0120 11:07:12.389755 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 20 11:07:12 crc kubenswrapper[4961]: I0120 11:07:12.435882 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 20 11:07:12 crc kubenswrapper[4961]: I0120 11:07:12.552539 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 20 11:07:12 crc kubenswrapper[4961]: I0120 11:07:12.706949 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 20 11:07:12 crc kubenswrapper[4961]: I0120 11:07:12.902891 4961 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 20 11:07:13 crc kubenswrapper[4961]: I0120 11:07:13.010407 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 20 11:07:13 crc kubenswrapper[4961]: I0120 11:07:13.049983 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 20 11:07:13 crc kubenswrapper[4961]: I0120 11:07:13.119694 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 20 11:07:13 crc kubenswrapper[4961]: I0120 11:07:13.127091 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 20 11:07:13 crc kubenswrapper[4961]: I0120 11:07:13.228052 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 20 11:07:13 crc kubenswrapper[4961]: I0120 11:07:13.362864 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 20 11:07:13 crc kubenswrapper[4961]: I0120 11:07:13.384987 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5844cf768-j624v" event={"ID":"24e048b6-45e1-41f0-ad2f-bef68af46750","Type":"ContainerStarted","Data":"d3c2a68924498e42feba0f1809b28df0044a1caa8487c69a8fa3130132d01513"} Jan 20 11:07:13 crc kubenswrapper[4961]: I0120 11:07:13.385240 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:13 crc kubenswrapper[4961]: I0120 11:07:13.390470 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5844cf768-j624v" Jan 20 11:07:13 crc kubenswrapper[4961]: I0120 11:07:13.402079 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5844cf768-j624v" podStartSLOduration=63.402045482 podStartE2EDuration="1m3.402045482s" podCreationTimestamp="2026-01-20 11:06:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:07:13.401505858 +0000 UTC m=+186.186005769" watchObservedRunningTime="2026-01-20 11:07:13.402045482 +0000 UTC m=+186.186545353" Jan 20 11:07:13 crc kubenswrapper[4961]: I0120 11:07:13.687419 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 20 11:07:13 crc kubenswrapper[4961]: I0120 11:07:13.773469 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 20 11:07:13 crc kubenswrapper[4961]: I0120 11:07:13.804434 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 20 11:07:14 crc kubenswrapper[4961]: I0120 11:07:14.052295 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 20 11:07:14 crc kubenswrapper[4961]: I0120 11:07:14.063033 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 20 11:07:14 crc kubenswrapper[4961]: I0120 11:07:14.188720 4961 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 20 11:07:14 crc kubenswrapper[4961]: I0120 11:07:14.309594 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 20 11:07:14 crc kubenswrapper[4961]: I0120 11:07:14.330545 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 20 11:07:14 crc kubenswrapper[4961]: I0120 11:07:14.423537 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 20 11:07:14 crc kubenswrapper[4961]: I0120 11:07:14.515828 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 20 11:07:14 crc kubenswrapper[4961]: I0120 11:07:14.603235 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 20 11:07:15 crc kubenswrapper[4961]: I0120 11:07:15.271473 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 20 11:07:15 crc kubenswrapper[4961]: I0120 11:07:15.394830 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.257991 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.258112 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.331445 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.331484 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.331522 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.331566 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.331588 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 11:07:17 crc 
kubenswrapper[4961]: I0120 11:07:17.331665 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.331710 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.331779 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.331813 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.332031 4961 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.332381 4961 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.332394 4961 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.332402 4961 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.341133 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.405620 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.405665 4961 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="82e497f5506cdf464204b7e7e630965aab4b30b06b6f0c608313177efa4f300e" exitCode=137 Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.405704 4961 scope.go:117] "RemoveContainer" containerID="82e497f5506cdf464204b7e7e630965aab4b30b06b6f0c608313177efa4f300e" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.405793 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.422554 4961 scope.go:117] "RemoveContainer" containerID="82e497f5506cdf464204b7e7e630965aab4b30b06b6f0c608313177efa4f300e" Jan 20 11:07:17 crc kubenswrapper[4961]: E0120 11:07:17.422952 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82e497f5506cdf464204b7e7e630965aab4b30b06b6f0c608313177efa4f300e\": container with ID starting with 82e497f5506cdf464204b7e7e630965aab4b30b06b6f0c608313177efa4f300e not found: ID does not exist" containerID="82e497f5506cdf464204b7e7e630965aab4b30b06b6f0c608313177efa4f300e" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.422989 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e497f5506cdf464204b7e7e630965aab4b30b06b6f0c608313177efa4f300e"} err="failed to get container status \"82e497f5506cdf464204b7e7e630965aab4b30b06b6f0c608313177efa4f300e\": rpc error: code = NotFound desc = could not find container \"82e497f5506cdf464204b7e7e630965aab4b30b06b6f0c608313177efa4f300e\": container with ID starting with 82e497f5506cdf464204b7e7e630965aab4b30b06b6f0c608313177efa4f300e not found: ID does not exist" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.434005 4961 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:17 crc kubenswrapper[4961]: I0120 11:07:17.545288 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 20 11:07:24 crc kubenswrapper[4961]: I0120 11:07:24.546993 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 20 11:07:28 crc kubenswrapper[4961]: I0120 11:07:28.030710 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 20 11:07:29 crc kubenswrapper[4961]: I0120 11:07:29.538581 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 20 11:07:31 crc kubenswrapper[4961]: I0120 11:07:31.722627 4961 patch_prober.go:28] interesting pod/machine-config-daemon-48nk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 11:07:31 crc kubenswrapper[4961]: I0120 11:07:31.722681 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" podUID="8a5754ab-8fe3-41b8-b760-b3d154e89ba8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 11:07:31 crc kubenswrapper[4961]: I0120 11:07:31.722719 4961 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" Jan 20 11:07:31 crc kubenswrapper[4961]: I0120 11:07:31.723209 4961 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6f067c6d9f779591467594bcff24d07919bbe280c82d7bc657faa215a6e63cdd"} pod="openshift-machine-config-operator/machine-config-daemon-48nk4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 11:07:31 crc kubenswrapper[4961]: I0120 11:07:31.723261 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" podUID="8a5754ab-8fe3-41b8-b760-b3d154e89ba8" containerName="machine-config-daemon" containerID="cri-o://6f067c6d9f779591467594bcff24d07919bbe280c82d7bc657faa215a6e63cdd" gracePeriod=600 Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.495669 4961 generic.go:334] "Generic (PLEG): container finished" podID="8a5754ab-8fe3-41b8-b760-b3d154e89ba8" containerID="6f067c6d9f779591467594bcff24d07919bbe280c82d7bc657faa215a6e63cdd" exitCode=0 Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.495765 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" event={"ID":"8a5754ab-8fe3-41b8-b760-b3d154e89ba8","Type":"ContainerDied","Data":"6f067c6d9f779591467594bcff24d07919bbe280c82d7bc657faa215a6e63cdd"} Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.496186 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" event={"ID":"8a5754ab-8fe3-41b8-b760-b3d154e89ba8","Type":"ContainerStarted","Data":"c7a9e2302da973f7c15895195b1bab45843c930db322ad1902f673d137a19234"} Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.537330 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dj2tn"] Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.537549 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" podUID="78a4572e-93b5-40eb-b11c-98a39f3c6a5b" containerName="controller-manager" containerID="cri-o://da24b44cbf75971cd5e43f8b57d4fea592fd88e9cea46653ab4654661abeb847" gracePeriod=30 Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.633278 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf"] Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.633809 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" podUID="51f20be1-6fa3-47ce-ac42-6d9a618ae151" containerName="route-controller-manager" 
containerID="cri-o://a1c3d4775e9e4a9e3dc281e4eba5ee818336421da5b88f986e4c066eb4998192" gracePeriod=30 Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.868598 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.985571 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-serving-cert\") pod \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.985649 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-config\") pod \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.985694 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f99ff\" (UniqueName: \"kubernetes.io/projected/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-kube-api-access-f99ff\") pod \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.985726 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-proxy-ca-bundles\") pod \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.985757 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-client-ca\") pod \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\" (UID: \"78a4572e-93b5-40eb-b11c-98a39f3c6a5b\") " Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.986488 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "78a4572e-93b5-40eb-b11c-98a39f3c6a5b" (UID: "78a4572e-93b5-40eb-b11c-98a39f3c6a5b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.986584 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-config" (OuterVolumeSpecName: "config") pod "78a4572e-93b5-40eb-b11c-98a39f3c6a5b" (UID: "78a4572e-93b5-40eb-b11c-98a39f3c6a5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.986729 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-client-ca" (OuterVolumeSpecName: "client-ca") pod "78a4572e-93b5-40eb-b11c-98a39f3c6a5b" (UID: "78a4572e-93b5-40eb-b11c-98a39f3c6a5b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.989610 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6446dffbbd-kz4d2"] Jan 20 11:07:32 crc kubenswrapper[4961]: E0120 11:07:32.989828 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.989845 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 20 11:07:32 crc kubenswrapper[4961]: E0120 11:07:32.989860 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78a4572e-93b5-40eb-b11c-98a39f3c6a5b" containerName="controller-manager" Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.989866 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="78a4572e-93b5-40eb-b11c-98a39f3c6a5b" containerName="controller-manager" Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.989951 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="78a4572e-93b5-40eb-b11c-98a39f3c6a5b" containerName="controller-manager" Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.989964 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.990348 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.997049 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "78a4572e-93b5-40eb-b11c-98a39f3c6a5b" (UID: "78a4572e-93b5-40eb-b11c-98a39f3c6a5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:07:32 crc kubenswrapper[4961]: I0120 11:07:32.997147 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-kube-api-access-f99ff" (OuterVolumeSpecName: "kube-api-access-f99ff") pod "78a4572e-93b5-40eb-b11c-98a39f3c6a5b" (UID: "78a4572e-93b5-40eb-b11c-98a39f3c6a5b"). InnerVolumeSpecName "kube-api-access-f99ff". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.012670 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6446dffbbd-kz4d2"] Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.067912 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.087650 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-config\") pod \"controller-manager-6446dffbbd-kz4d2\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.087722 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr9kc\" (UniqueName: \"kubernetes.io/projected/b6aebb4a-5794-41ec-9084-2a0fc9103b58-kube-api-access-zr9kc\") pod \"controller-manager-6446dffbbd-kz4d2\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.087757 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-client-ca\") pod \"controller-manager-6446dffbbd-kz4d2\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.087805 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6aebb4a-5794-41ec-9084-2a0fc9103b58-serving-cert\") pod \"controller-manager-6446dffbbd-kz4d2\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.087910 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-proxy-ca-bundles\") pod \"controller-manager-6446dffbbd-kz4d2\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.088054 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.088088 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.088099 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f99ff\" (UniqueName: \"kubernetes.io/projected/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-kube-api-access-f99ff\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.088110 4961 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.088127 4961 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/78a4572e-93b5-40eb-b11c-98a39f3c6a5b-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.189192 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51f20be1-6fa3-47ce-ac42-6d9a618ae151-config\") pod \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\" (UID: \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\") " Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.189248 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51f20be1-6fa3-47ce-ac42-6d9a618ae151-client-ca\") pod \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\" (UID: \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\") " Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.189320 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51f20be1-6fa3-47ce-ac42-6d9a618ae151-serving-cert\") pod \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\" (UID: \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\") " Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.189456 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st6fv\" (UniqueName: \"kubernetes.io/projected/51f20be1-6fa3-47ce-ac42-6d9a618ae151-kube-api-access-st6fv\") pod \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\" (UID: \"51f20be1-6fa3-47ce-ac42-6d9a618ae151\") " Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.189620 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-proxy-ca-bundles\") pod \"controller-manager-6446dffbbd-kz4d2\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.189687 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-config\") pod \"controller-manager-6446dffbbd-kz4d2\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.189734 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr9kc\" (UniqueName: \"kubernetes.io/projected/b6aebb4a-5794-41ec-9084-2a0fc9103b58-kube-api-access-zr9kc\") pod \"controller-manager-6446dffbbd-kz4d2\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.189767 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-client-ca\") pod \"controller-manager-6446dffbbd-kz4d2\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.189797 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6aebb4a-5794-41ec-9084-2a0fc9103b58-serving-cert\") pod \"controller-manager-6446dffbbd-kz4d2\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " 
pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.190218 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51f20be1-6fa3-47ce-ac42-6d9a618ae151-config" (OuterVolumeSpecName: "config") pod "51f20be1-6fa3-47ce-ac42-6d9a618ae151" (UID: "51f20be1-6fa3-47ce-ac42-6d9a618ae151"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.190861 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51f20be1-6fa3-47ce-ac42-6d9a618ae151-client-ca" (OuterVolumeSpecName: "client-ca") pod "51f20be1-6fa3-47ce-ac42-6d9a618ae151" (UID: "51f20be1-6fa3-47ce-ac42-6d9a618ae151"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.191261 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-proxy-ca-bundles\") pod \"controller-manager-6446dffbbd-kz4d2\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.191672 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-config\") pod \"controller-manager-6446dffbbd-kz4d2\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.191727 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-client-ca\") pod \"controller-manager-6446dffbbd-kz4d2\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.193755 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f20be1-6fa3-47ce-ac42-6d9a618ae151-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "51f20be1-6fa3-47ce-ac42-6d9a618ae151" (UID: "51f20be1-6fa3-47ce-ac42-6d9a618ae151"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.193904 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6aebb4a-5794-41ec-9084-2a0fc9103b58-serving-cert\") pod \"controller-manager-6446dffbbd-kz4d2\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.193884 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f20be1-6fa3-47ce-ac42-6d9a618ae151-kube-api-access-st6fv" (OuterVolumeSpecName: "kube-api-access-st6fv") pod "51f20be1-6fa3-47ce-ac42-6d9a618ae151" (UID: "51f20be1-6fa3-47ce-ac42-6d9a618ae151"). InnerVolumeSpecName "kube-api-access-st6fv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.205806 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr9kc\" (UniqueName: \"kubernetes.io/projected/b6aebb4a-5794-41ec-9084-2a0fc9103b58-kube-api-access-zr9kc\") pod \"controller-manager-6446dffbbd-kz4d2\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.291822 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st6fv\" (UniqueName: \"kubernetes.io/projected/51f20be1-6fa3-47ce-ac42-6d9a618ae151-kube-api-access-st6fv\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.291887 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51f20be1-6fa3-47ce-ac42-6d9a618ae151-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.291897 4961 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51f20be1-6fa3-47ce-ac42-6d9a618ae151-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.291913 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51f20be1-6fa3-47ce-ac42-6d9a618ae151-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.361359 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.508112 4961 generic.go:334] "Generic (PLEG): container finished" podID="78a4572e-93b5-40eb-b11c-98a39f3c6a5b" containerID="da24b44cbf75971cd5e43f8b57d4fea592fd88e9cea46653ab4654661abeb847" exitCode=0 Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.508181 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" event={"ID":"78a4572e-93b5-40eb-b11c-98a39f3c6a5b","Type":"ContainerDied","Data":"da24b44cbf75971cd5e43f8b57d4fea592fd88e9cea46653ab4654661abeb847"} Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.508199 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.508632 4961 scope.go:117] "RemoveContainer" containerID="da24b44cbf75971cd5e43f8b57d4fea592fd88e9cea46653ab4654661abeb847" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.508612 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dj2tn" event={"ID":"78a4572e-93b5-40eb-b11c-98a39f3c6a5b","Type":"ContainerDied","Data":"5b2381336084ee6b4c58ab7166919218759af4081282f2f056d26b6af7593249"} Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.510141 4961 generic.go:334] "Generic (PLEG): container finished" podID="51f20be1-6fa3-47ce-ac42-6d9a618ae151" containerID="a1c3d4775e9e4a9e3dc281e4eba5ee818336421da5b88f986e4c066eb4998192" exitCode=0 Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.510181 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" event={"ID":"51f20be1-6fa3-47ce-ac42-6d9a618ae151","Type":"ContainerDied","Data":"a1c3d4775e9e4a9e3dc281e4eba5ee818336421da5b88f986e4c066eb4998192"} Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.510209 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" event={"ID":"51f20be1-6fa3-47ce-ac42-6d9a618ae151","Type":"ContainerDied","Data":"8f6b3e2f3e67fb812a02bf97ff754919949fc30ff38821f046abec0d4c6d81df"} Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.510265 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.538336 4961 scope.go:117] "RemoveContainer" containerID="da24b44cbf75971cd5e43f8b57d4fea592fd88e9cea46653ab4654661abeb847" Jan 20 11:07:33 crc kubenswrapper[4961]: E0120 11:07:33.539053 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da24b44cbf75971cd5e43f8b57d4fea592fd88e9cea46653ab4654661abeb847\": container with ID starting with da24b44cbf75971cd5e43f8b57d4fea592fd88e9cea46653ab4654661abeb847 not found: ID does not exist" containerID="da24b44cbf75971cd5e43f8b57d4fea592fd88e9cea46653ab4654661abeb847" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.539101 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da24b44cbf75971cd5e43f8b57d4fea592fd88e9cea46653ab4654661abeb847"} err="failed to get container status \"da24b44cbf75971cd5e43f8b57d4fea592fd88e9cea46653ab4654661abeb847\": rpc error: code = NotFound desc = could not find container \"da24b44cbf75971cd5e43f8b57d4fea592fd88e9cea46653ab4654661abeb847\": container with ID starting with da24b44cbf75971cd5e43f8b57d4fea592fd88e9cea46653ab4654661abeb847 not found: ID does not exist" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.539149 4961 scope.go:117] "RemoveContainer" containerID="a1c3d4775e9e4a9e3dc281e4eba5ee818336421da5b88f986e4c066eb4998192" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.551844 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf"] Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.551975 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-86klf"] Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.556901 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dj2tn"] Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.560973 4961 scope.go:117] "RemoveContainer" containerID="a1c3d4775e9e4a9e3dc281e4eba5ee818336421da5b88f986e4c066eb4998192" Jan 20 11:07:33 crc kubenswrapper[4961]: E0120 11:07:33.561379 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1c3d4775e9e4a9e3dc281e4eba5ee818336421da5b88f986e4c066eb4998192\": container with ID starting with a1c3d4775e9e4a9e3dc281e4eba5ee818336421da5b88f986e4c066eb4998192 not found: ID does not exist" containerID="a1c3d4775e9e4a9e3dc281e4eba5ee818336421da5b88f986e4c066eb4998192" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.561428 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1c3d4775e9e4a9e3dc281e4eba5ee818336421da5b88f986e4c066eb4998192"} err="failed to get container status \"a1c3d4775e9e4a9e3dc281e4eba5ee818336421da5b88f986e4c066eb4998192\": rpc error: code = NotFound desc = could not find container \"a1c3d4775e9e4a9e3dc281e4eba5ee818336421da5b88f986e4c066eb4998192\": container with ID starting with a1c3d4775e9e4a9e3dc281e4eba5ee818336421da5b88f986e4c066eb4998192 not found: ID does not exist" Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.562584 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dj2tn"] Jan 20 11:07:33 crc kubenswrapper[4961]: I0120 11:07:33.569731 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6446dffbbd-kz4d2"] Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.351509 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr"] Jan 20 11:07:34 crc kubenswrapper[4961]: E0120 11:07:34.352695 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f20be1-6fa3-47ce-ac42-6d9a618ae151" containerName="route-controller-manager" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.352715 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f20be1-6fa3-47ce-ac42-6d9a618ae151" containerName="route-controller-manager" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.352859 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f20be1-6fa3-47ce-ac42-6d9a618ae151" containerName="route-controller-manager" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.353498 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.360128 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.360307 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.360475 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.360645 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.360913 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.361128 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.365663 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr"] Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.511538 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h4vc\" (UniqueName: \"kubernetes.io/projected/4c4cb271-f0ee-4997-8e87-095f48d6658f-kube-api-access-2h4vc\") pod \"route-controller-manager-6b586b96dd-b24nr\" (UID: \"4c4cb271-f0ee-4997-8e87-095f48d6658f\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.511633 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4cb271-f0ee-4997-8e87-095f48d6658f-config\") pod \"route-controller-manager-6b586b96dd-b24nr\" (UID: \"4c4cb271-f0ee-4997-8e87-095f48d6658f\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.511657 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c4cb271-f0ee-4997-8e87-095f48d6658f-serving-cert\") pod \"route-controller-manager-6b586b96dd-b24nr\" (UID: \"4c4cb271-f0ee-4997-8e87-095f48d6658f\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.511704 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c4cb271-f0ee-4997-8e87-095f48d6658f-client-ca\") pod \"route-controller-manager-6b586b96dd-b24nr\" (UID: \"4c4cb271-f0ee-4997-8e87-095f48d6658f\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.520118 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Jan 20 11:07:34 crc 
kubenswrapper[4961]: I0120 11:07:34.521855 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.521909 4961 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="aa53a55846bf76c823337eced0cbef70e389a9dfab7d87ce78931fec695fa4e4" exitCode=137 Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.522031 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"aa53a55846bf76c823337eced0cbef70e389a9dfab7d87ce78931fec695fa4e4"} Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.522149 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dfdb3210a47d2546605c975a3169c18f919c9434df84f19856117a99e5f9f834"} Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.522180 4961 scope.go:117] "RemoveContainer" containerID="315a88354b4b31aef2ff2afed632ace0acfd5b7bfd5f822722f82ad407c88bce" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.536808 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" event={"ID":"b6aebb4a-5794-41ec-9084-2a0fc9103b58","Type":"ContainerStarted","Data":"ebd2de122e6aaa17cd8a0c213b7152b98e0a155ee6742e140e493f8519a40869"} Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.536855 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" event={"ID":"b6aebb4a-5794-41ec-9084-2a0fc9103b58","Type":"ContainerStarted","Data":"2bc1b0ec94cef7b8fc2a4e81fdaf2963ee0627e736f827170608247b6b1f34cf"} Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.538249 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.543305 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.565836 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" podStartSLOduration=2.56581791 podStartE2EDuration="2.56581791s" podCreationTimestamp="2026-01-20 11:07:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:07:34.563700809 +0000 UTC m=+207.348200690" watchObservedRunningTime="2026-01-20 11:07:34.56581791 +0000 UTC m=+207.350317781" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.612885 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h4vc\" (UniqueName: \"kubernetes.io/projected/4c4cb271-f0ee-4997-8e87-095f48d6658f-kube-api-access-2h4vc\") pod \"route-controller-manager-6b586b96dd-b24nr\" (UID: \"4c4cb271-f0ee-4997-8e87-095f48d6658f\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.613201 4961 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4cb271-f0ee-4997-8e87-095f48d6658f-config\") pod \"route-controller-manager-6b586b96dd-b24nr\" (UID: \"4c4cb271-f0ee-4997-8e87-095f48d6658f\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.613284 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c4cb271-f0ee-4997-8e87-095f48d6658f-serving-cert\") pod \"route-controller-manager-6b586b96dd-b24nr\" (UID: \"4c4cb271-f0ee-4997-8e87-095f48d6658f\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.613391 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c4cb271-f0ee-4997-8e87-095f48d6658f-client-ca\") pod \"route-controller-manager-6b586b96dd-b24nr\" (UID: \"4c4cb271-f0ee-4997-8e87-095f48d6658f\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.614258 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c4cb271-f0ee-4997-8e87-095f48d6658f-client-ca\") pod \"route-controller-manager-6b586b96dd-b24nr\" (UID: \"4c4cb271-f0ee-4997-8e87-095f48d6658f\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.614317 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4cb271-f0ee-4997-8e87-095f48d6658f-config\") pod \"route-controller-manager-6b586b96dd-b24nr\" (UID: \"4c4cb271-f0ee-4997-8e87-095f48d6658f\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.619487 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c4cb271-f0ee-4997-8e87-095f48d6658f-serving-cert\") pod \"route-controller-manager-6b586b96dd-b24nr\" (UID: \"4c4cb271-f0ee-4997-8e87-095f48d6658f\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.629749 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h4vc\" (UniqueName: \"kubernetes.io/projected/4c4cb271-f0ee-4997-8e87-095f48d6658f-kube-api-access-2h4vc\") pod \"route-controller-manager-6b586b96dd-b24nr\" (UID: \"4c4cb271-f0ee-4997-8e87-095f48d6658f\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.689492 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:34 crc kubenswrapper[4961]: I0120 11:07:34.879478 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr"] Jan 20 11:07:35 crc kubenswrapper[4961]: I0120 11:07:35.546690 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51f20be1-6fa3-47ce-ac42-6d9a618ae151" path="/var/lib/kubelet/pods/51f20be1-6fa3-47ce-ac42-6d9a618ae151/volumes" Jan 20 11:07:35 crc kubenswrapper[4961]: I0120 11:07:35.548384 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78a4572e-93b5-40eb-b11c-98a39f3c6a5b" path="/var/lib/kubelet/pods/78a4572e-93b5-40eb-b11c-98a39f3c6a5b/volumes" Jan 20 11:07:35 crc kubenswrapper[4961]: I0120 11:07:35.549487 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:35 crc kubenswrapper[4961]: I0120 11:07:35.549524 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" event={"ID":"4c4cb271-f0ee-4997-8e87-095f48d6658f","Type":"ContainerStarted","Data":"0b91837dedd88d57c7d0ce088c868d0bf5cc1fe1f034b17d4d31ae2a7e1c7fcf"} Jan 20 11:07:35 crc kubenswrapper[4961]: I0120 11:07:35.549541 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" event={"ID":"4c4cb271-f0ee-4997-8e87-095f48d6658f","Type":"ContainerStarted","Data":"db03761a76c32557ff0c908dc3b9ac763fc9801dd71467bb95ba323dab2f27b4"} Jan 20 11:07:35 crc kubenswrapper[4961]: I0120 11:07:35.552266 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Jan 20 11:07:35 crc kubenswrapper[4961]: I0120 11:07:35.552274 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:35 crc kubenswrapper[4961]: I0120 11:07:35.566993 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" podStartSLOduration=3.566965309 podStartE2EDuration="3.566965309s" podCreationTimestamp="2026-01-20 11:07:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:07:35.563503804 +0000 UTC m=+208.348003685" watchObservedRunningTime="2026-01-20 11:07:35.566965309 +0000 UTC m=+208.351465180" Jan 20 11:07:37 crc kubenswrapper[4961]: I0120 11:07:37.576048 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rcphf"] Jan 20 11:07:37 crc kubenswrapper[4961]: I0120 11:07:37.576715 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rcphf" podUID="3449c15e-8212-40ed-85f5-37a0f79fd9e4" containerName="registry-server" containerID="cri-o://9987973e3b02d36434ad467f5f731e314a091f54fa5e2497a3d42e709e6359e9" gracePeriod=30 Jan 20 11:07:37 crc kubenswrapper[4961]: I0120 11:07:37.586775 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7p747"] Jan 20 11:07:37 crc 
kubenswrapper[4961]: I0120 11:07:37.587402 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7p747" podUID="4155767c-ce93-427a-9a44-d02d9fa3ac62" containerName="registry-server" containerID="cri-o://a30e9c9dd5df963c853613fc5a0081bfc456cfe12d50f6eedb41cfdd516fec6d" gracePeriod=30 Jan 20 11:07:37 crc kubenswrapper[4961]: I0120 11:07:37.592325 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bslr7"] Jan 20 11:07:37 crc kubenswrapper[4961]: I0120 11:07:37.592570 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" podUID="221c46d0-ccdb-4e6a-a143-04c3bce55711" containerName="marketplace-operator" containerID="cri-o://c34cc118527596dbc2be1b8a8fe04864e553e30873c8f0c0bed91b3b29ef4953" gracePeriod=30 Jan 20 11:07:37 crc kubenswrapper[4961]: I0120 11:07:37.596189 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-675m8"] Jan 20 11:07:37 crc kubenswrapper[4961]: I0120 11:07:37.596610 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-675m8" podUID="52943ef2-6fee-4910-8dd2-3723b3575824" containerName="registry-server" containerID="cri-o://e1c02fe503f3dd6dea2e13339cb024dd72f437646290a4a9a5995e7deb1808bc" gracePeriod=30 Jan 20 11:07:37 crc kubenswrapper[4961]: I0120 11:07:37.612921 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h2djc"] Jan 20 11:07:37 crc kubenswrapper[4961]: I0120 11:07:37.613161 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h2djc" podUID="03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef" containerName="registry-server" containerID="cri-o://52619eb6a5e5d75703b016d88aab8b70c37ae836cfac1c2b88dc58d7e8dfa5ef" gracePeriod=30 Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.068705 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rcphf" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.089766 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dph7\" (UniqueName: \"kubernetes.io/projected/3449c15e-8212-40ed-85f5-37a0f79fd9e4-kube-api-access-2dph7\") pod \"3449c15e-8212-40ed-85f5-37a0f79fd9e4\" (UID: \"3449c15e-8212-40ed-85f5-37a0f79fd9e4\") " Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.089821 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3449c15e-8212-40ed-85f5-37a0f79fd9e4-utilities\") pod \"3449c15e-8212-40ed-85f5-37a0f79fd9e4\" (UID: \"3449c15e-8212-40ed-85f5-37a0f79fd9e4\") " Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.089879 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3449c15e-8212-40ed-85f5-37a0f79fd9e4-catalog-content\") pod \"3449c15e-8212-40ed-85f5-37a0f79fd9e4\" (UID: \"3449c15e-8212-40ed-85f5-37a0f79fd9e4\") " Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.093257 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3449c15e-8212-40ed-85f5-37a0f79fd9e4-utilities" (OuterVolumeSpecName: "utilities") pod "3449c15e-8212-40ed-85f5-37a0f79fd9e4" (UID: "3449c15e-8212-40ed-85f5-37a0f79fd9e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.111697 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3449c15e-8212-40ed-85f5-37a0f79fd9e4-kube-api-access-2dph7" (OuterVolumeSpecName: "kube-api-access-2dph7") pod "3449c15e-8212-40ed-85f5-37a0f79fd9e4" (UID: "3449c15e-8212-40ed-85f5-37a0f79fd9e4"). InnerVolumeSpecName "kube-api-access-2dph7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.147357 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7p747" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.151114 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3449c15e-8212-40ed-85f5-37a0f79fd9e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3449c15e-8212-40ed-85f5-37a0f79fd9e4" (UID: "3449c15e-8212-40ed-85f5-37a0f79fd9e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.160822 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-675m8" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.174897 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h2djc" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.190976 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.191016 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqlsp\" (UniqueName: \"kubernetes.io/projected/4155767c-ce93-427a-9a44-d02d9fa3ac62-kube-api-access-hqlsp\") pod \"4155767c-ce93-427a-9a44-d02d9fa3ac62\" (UID: \"4155767c-ce93-427a-9a44-d02d9fa3ac62\") " Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.191167 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvpbg\" (UniqueName: \"kubernetes.io/projected/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-kube-api-access-hvpbg\") pod \"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef\" (UID: \"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef\") " Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.191201 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52943ef2-6fee-4910-8dd2-3723b3575824-utilities\") pod \"52943ef2-6fee-4910-8dd2-3723b3575824\" (UID: \"52943ef2-6fee-4910-8dd2-3723b3575824\") " Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.191479 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3449c15e-8212-40ed-85f5-37a0f79fd9e4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.191498 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dph7\" (UniqueName: \"kubernetes.io/projected/3449c15e-8212-40ed-85f5-37a0f79fd9e4-kube-api-access-2dph7\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.191513 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3449c15e-8212-40ed-85f5-37a0f79fd9e4-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.192378 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52943ef2-6fee-4910-8dd2-3723b3575824-utilities" (OuterVolumeSpecName: "utilities") pod "52943ef2-6fee-4910-8dd2-3723b3575824" (UID: "52943ef2-6fee-4910-8dd2-3723b3575824"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.193538 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4155767c-ce93-427a-9a44-d02d9fa3ac62-kube-api-access-hqlsp" (OuterVolumeSpecName: "kube-api-access-hqlsp") pod "4155767c-ce93-427a-9a44-d02d9fa3ac62" (UID: "4155767c-ce93-427a-9a44-d02d9fa3ac62"). InnerVolumeSpecName "kube-api-access-hqlsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.194952 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-kube-api-access-hvpbg" (OuterVolumeSpecName: "kube-api-access-hvpbg") pod "03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef" (UID: "03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef"). InnerVolumeSpecName "kube-api-access-hvpbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.292469 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2njzn\" (UniqueName: \"kubernetes.io/projected/221c46d0-ccdb-4e6a-a143-04c3bce55711-kube-api-access-2njzn\") pod \"221c46d0-ccdb-4e6a-a143-04c3bce55711\" (UID: \"221c46d0-ccdb-4e6a-a143-04c3bce55711\") " Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.292546 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4155767c-ce93-427a-9a44-d02d9fa3ac62-utilities\") pod \"4155767c-ce93-427a-9a44-d02d9fa3ac62\" (UID: \"4155767c-ce93-427a-9a44-d02d9fa3ac62\") " Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.292570 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2v6b\" (UniqueName: \"kubernetes.io/projected/52943ef2-6fee-4910-8dd2-3723b3575824-kube-api-access-t2v6b\") pod \"52943ef2-6fee-4910-8dd2-3723b3575824\" (UID: \"52943ef2-6fee-4910-8dd2-3723b3575824\") " Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.292601 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4155767c-ce93-427a-9a44-d02d9fa3ac62-catalog-content\") pod \"4155767c-ce93-427a-9a44-d02d9fa3ac62\" (UID: \"4155767c-ce93-427a-9a44-d02d9fa3ac62\") " Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.292633 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-catalog-content\") pod \"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef\" (UID: \"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef\") " Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.292665 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52943ef2-6fee-4910-8dd2-3723b3575824-catalog-content\") pod \"52943ef2-6fee-4910-8dd2-3723b3575824\" (UID: \"52943ef2-6fee-4910-8dd2-3723b3575824\") " Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.292696 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-utilities\") pod \"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef\" (UID: \"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef\") " Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.292736 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/221c46d0-ccdb-4e6a-a143-04c3bce55711-marketplace-operator-metrics\") pod \"221c46d0-ccdb-4e6a-a143-04c3bce55711\" (UID: \"221c46d0-ccdb-4e6a-a143-04c3bce55711\") " Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.292761 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/221c46d0-ccdb-4e6a-a143-04c3bce55711-marketplace-trusted-ca\") pod \"221c46d0-ccdb-4e6a-a143-04c3bce55711\" (UID: \"221c46d0-ccdb-4e6a-a143-04c3bce55711\") " Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.292912 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqlsp\" (UniqueName: \"kubernetes.io/projected/4155767c-ce93-427a-9a44-d02d9fa3ac62-kube-api-access-hqlsp\") on node 
\"crc\" DevicePath \"\"" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.292927 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvpbg\" (UniqueName: \"kubernetes.io/projected/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-kube-api-access-hvpbg\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.292939 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52943ef2-6fee-4910-8dd2-3723b3575824-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.293675 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-utilities" (OuterVolumeSpecName: "utilities") pod "03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef" (UID: "03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.293797 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4155767c-ce93-427a-9a44-d02d9fa3ac62-utilities" (OuterVolumeSpecName: "utilities") pod "4155767c-ce93-427a-9a44-d02d9fa3ac62" (UID: "4155767c-ce93-427a-9a44-d02d9fa3ac62"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.294539 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/221c46d0-ccdb-4e6a-a143-04c3bce55711-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "221c46d0-ccdb-4e6a-a143-04c3bce55711" (UID: "221c46d0-ccdb-4e6a-a143-04c3bce55711"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.296789 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/221c46d0-ccdb-4e6a-a143-04c3bce55711-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "221c46d0-ccdb-4e6a-a143-04c3bce55711" (UID: "221c46d0-ccdb-4e6a-a143-04c3bce55711"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.299855 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/221c46d0-ccdb-4e6a-a143-04c3bce55711-kube-api-access-2njzn" (OuterVolumeSpecName: "kube-api-access-2njzn") pod "221c46d0-ccdb-4e6a-a143-04c3bce55711" (UID: "221c46d0-ccdb-4e6a-a143-04c3bce55711"). InnerVolumeSpecName "kube-api-access-2njzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.300869 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52943ef2-6fee-4910-8dd2-3723b3575824-kube-api-access-t2v6b" (OuterVolumeSpecName: "kube-api-access-t2v6b") pod "52943ef2-6fee-4910-8dd2-3723b3575824" (UID: "52943ef2-6fee-4910-8dd2-3723b3575824"). InnerVolumeSpecName "kube-api-access-t2v6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.314490 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52943ef2-6fee-4910-8dd2-3723b3575824-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52943ef2-6fee-4910-8dd2-3723b3575824" (UID: "52943ef2-6fee-4910-8dd2-3723b3575824"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.342788 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4155767c-ce93-427a-9a44-d02d9fa3ac62-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4155767c-ce93-427a-9a44-d02d9fa3ac62" (UID: "4155767c-ce93-427a-9a44-d02d9fa3ac62"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.394778 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52943ef2-6fee-4910-8dd2-3723b3575824-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.394821 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.394833 4961 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/221c46d0-ccdb-4e6a-a143-04c3bce55711-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.394845 4961 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/221c46d0-ccdb-4e6a-a143-04c3bce55711-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.394854 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2njzn\" (UniqueName: \"kubernetes.io/projected/221c46d0-ccdb-4e6a-a143-04c3bce55711-kube-api-access-2njzn\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.394862 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4155767c-ce93-427a-9a44-d02d9fa3ac62-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.394871 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2v6b\" (UniqueName: \"kubernetes.io/projected/52943ef2-6fee-4910-8dd2-3723b3575824-kube-api-access-t2v6b\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.394881 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4155767c-ce93-427a-9a44-d02d9fa3ac62-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.446265 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef" (UID: "03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.496685 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.533215 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.572213 4961 generic.go:334] "Generic (PLEG): container finished" podID="3449c15e-8212-40ed-85f5-37a0f79fd9e4" containerID="9987973e3b02d36434ad467f5f731e314a091f54fa5e2497a3d42e709e6359e9" exitCode=0 Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.572286 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcphf" event={"ID":"3449c15e-8212-40ed-85f5-37a0f79fd9e4","Type":"ContainerDied","Data":"9987973e3b02d36434ad467f5f731e314a091f54fa5e2497a3d42e709e6359e9"} Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.572354 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcphf" event={"ID":"3449c15e-8212-40ed-85f5-37a0f79fd9e4","Type":"ContainerDied","Data":"3694d3b9303e0a6876c6eedabc3495c7796b4724e5a0915b7534c0c86e1a363d"} Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.572376 4961 scope.go:117] "RemoveContainer" containerID="9987973e3b02d36434ad467f5f731e314a091f54fa5e2497a3d42e709e6359e9" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.572376 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rcphf" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.578795 4961 generic.go:334] "Generic (PLEG): container finished" podID="52943ef2-6fee-4910-8dd2-3723b3575824" containerID="e1c02fe503f3dd6dea2e13339cb024dd72f437646290a4a9a5995e7deb1808bc" exitCode=0 Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.578961 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-675m8" event={"ID":"52943ef2-6fee-4910-8dd2-3723b3575824","Type":"ContainerDied","Data":"e1c02fe503f3dd6dea2e13339cb024dd72f437646290a4a9a5995e7deb1808bc"} Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.579015 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-675m8" event={"ID":"52943ef2-6fee-4910-8dd2-3723b3575824","Type":"ContainerDied","Data":"eb7b8a712cf9ec2441c725e853b77acb524183007d51d1a80ca2223eba6e8395"} Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.579180 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-675m8" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.582496 4961 generic.go:334] "Generic (PLEG): container finished" podID="221c46d0-ccdb-4e6a-a143-04c3bce55711" containerID="c34cc118527596dbc2be1b8a8fe04864e553e30873c8f0c0bed91b3b29ef4953" exitCode=0 Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.582593 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" event={"ID":"221c46d0-ccdb-4e6a-a143-04c3bce55711","Type":"ContainerDied","Data":"c34cc118527596dbc2be1b8a8fe04864e553e30873c8f0c0bed91b3b29ef4953"} Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.582628 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" event={"ID":"221c46d0-ccdb-4e6a-a143-04c3bce55711","Type":"ContainerDied","Data":"299c5de65e96cad27fe4afa404259182ea95159dc51fe9410a74117eb96ac2d0"} Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.582630 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bslr7" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.591643 4961 generic.go:334] "Generic (PLEG): container finished" podID="4155767c-ce93-427a-9a44-d02d9fa3ac62" containerID="a30e9c9dd5df963c853613fc5a0081bfc456cfe12d50f6eedb41cfdd516fec6d" exitCode=0 Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.591809 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7p747" event={"ID":"4155767c-ce93-427a-9a44-d02d9fa3ac62","Type":"ContainerDied","Data":"a30e9c9dd5df963c853613fc5a0081bfc456cfe12d50f6eedb41cfdd516fec6d"} Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.591841 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7p747" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.591864 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7p747" event={"ID":"4155767c-ce93-427a-9a44-d02d9fa3ac62","Type":"ContainerDied","Data":"36460b7aaa5849565684ffe3db3c944519dc840739ad685976f686d5776ab003"} Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.596104 4961 scope.go:117] "RemoveContainer" containerID="983f272a7e07d9df9f75c1a26238cb18c7b7f8067d906c6452b26fb3e6c2cbef" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.600379 4961 generic.go:334] "Generic (PLEG): container finished" podID="03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef" containerID="52619eb6a5e5d75703b016d88aab8b70c37ae836cfac1c2b88dc58d7e8dfa5ef" exitCode=0 Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.600439 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h2djc" event={"ID":"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef","Type":"ContainerDied","Data":"52619eb6a5e5d75703b016d88aab8b70c37ae836cfac1c2b88dc58d7e8dfa5ef"} Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.600483 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h2djc" event={"ID":"03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef","Type":"ContainerDied","Data":"ead4b0c5adc9712208f8252246af9c6ab7ed282ec2ae149cfca2574d57e744d1"} Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.600446 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h2djc" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.630711 4961 scope.go:117] "RemoveContainer" containerID="39be0bf829671ab76bdd882eee4ce9d31d99d765ba17ed7e90e97eaf5c0fcd61" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.648913 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bslr7"] Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.657265 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bslr7"] Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.669451 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h2djc"] Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.670047 4961 scope.go:117] "RemoveContainer" containerID="9987973e3b02d36434ad467f5f731e314a091f54fa5e2497a3d42e709e6359e9" Jan 20 11:07:38 crc kubenswrapper[4961]: E0120 11:07:38.670534 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9987973e3b02d36434ad467f5f731e314a091f54fa5e2497a3d42e709e6359e9\": container with ID starting with 9987973e3b02d36434ad467f5f731e314a091f54fa5e2497a3d42e709e6359e9 not found: ID does not exist" containerID="9987973e3b02d36434ad467f5f731e314a091f54fa5e2497a3d42e709e6359e9" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.670610 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9987973e3b02d36434ad467f5f731e314a091f54fa5e2497a3d42e709e6359e9"} err="failed to get container status \"9987973e3b02d36434ad467f5f731e314a091f54fa5e2497a3d42e709e6359e9\": rpc error: code = NotFound desc = could not find container \"9987973e3b02d36434ad467f5f731e314a091f54fa5e2497a3d42e709e6359e9\": container with ID starting with 9987973e3b02d36434ad467f5f731e314a091f54fa5e2497a3d42e709e6359e9 not found: ID does not exist" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.670657 4961 scope.go:117] "RemoveContainer" containerID="983f272a7e07d9df9f75c1a26238cb18c7b7f8067d906c6452b26fb3e6c2cbef" Jan 20 11:07:38 crc kubenswrapper[4961]: E0120 11:07:38.671326 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"983f272a7e07d9df9f75c1a26238cb18c7b7f8067d906c6452b26fb3e6c2cbef\": container with ID starting with 983f272a7e07d9df9f75c1a26238cb18c7b7f8067d906c6452b26fb3e6c2cbef not found: ID does not exist" containerID="983f272a7e07d9df9f75c1a26238cb18c7b7f8067d906c6452b26fb3e6c2cbef" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.671370 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"983f272a7e07d9df9f75c1a26238cb18c7b7f8067d906c6452b26fb3e6c2cbef"} err="failed to get container status \"983f272a7e07d9df9f75c1a26238cb18c7b7f8067d906c6452b26fb3e6c2cbef\": rpc error: code = NotFound desc = could not find container \"983f272a7e07d9df9f75c1a26238cb18c7b7f8067d906c6452b26fb3e6c2cbef\": container with ID starting with 983f272a7e07d9df9f75c1a26238cb18c7b7f8067d906c6452b26fb3e6c2cbef not found: ID does not exist" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.671402 4961 scope.go:117] "RemoveContainer" containerID="39be0bf829671ab76bdd882eee4ce9d31d99d765ba17ed7e90e97eaf5c0fcd61" Jan 20 11:07:38 crc kubenswrapper[4961]: E0120 11:07:38.671773 4961 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"39be0bf829671ab76bdd882eee4ce9d31d99d765ba17ed7e90e97eaf5c0fcd61\": container with ID starting with 39be0bf829671ab76bdd882eee4ce9d31d99d765ba17ed7e90e97eaf5c0fcd61 not found: ID does not exist" containerID="39be0bf829671ab76bdd882eee4ce9d31d99d765ba17ed7e90e97eaf5c0fcd61" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.671810 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39be0bf829671ab76bdd882eee4ce9d31d99d765ba17ed7e90e97eaf5c0fcd61"} err="failed to get container status \"39be0bf829671ab76bdd882eee4ce9d31d99d765ba17ed7e90e97eaf5c0fcd61\": rpc error: code = NotFound desc = could not find container \"39be0bf829671ab76bdd882eee4ce9d31d99d765ba17ed7e90e97eaf5c0fcd61\": container with ID starting with 39be0bf829671ab76bdd882eee4ce9d31d99d765ba17ed7e90e97eaf5c0fcd61 not found: ID does not exist" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.671839 4961 scope.go:117] "RemoveContainer" containerID="e1c02fe503f3dd6dea2e13339cb024dd72f437646290a4a9a5995e7deb1808bc" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.680651 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h2djc"] Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.685906 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rcphf"] Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.691432 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rcphf"] Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.702744 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7p747"] Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.702811 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7p747"] Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.708018 4961 scope.go:117] "RemoveContainer" containerID="eaaa3763085652778605d1c770f69efc257c579aca3c2d1e3ef7696cd35dfeed" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.709417 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-675m8"] Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.713616 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-675m8"] Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.725329 4961 scope.go:117] "RemoveContainer" containerID="dcb0ae588665ea4664fa21aec906b834140b5698b935a21193ed9be87e8018ea" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.743379 4961 scope.go:117] "RemoveContainer" containerID="e1c02fe503f3dd6dea2e13339cb024dd72f437646290a4a9a5995e7deb1808bc" Jan 20 11:07:38 crc kubenswrapper[4961]: E0120 11:07:38.744407 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1c02fe503f3dd6dea2e13339cb024dd72f437646290a4a9a5995e7deb1808bc\": container with ID starting with e1c02fe503f3dd6dea2e13339cb024dd72f437646290a4a9a5995e7deb1808bc not found: ID does not exist" containerID="e1c02fe503f3dd6dea2e13339cb024dd72f437646290a4a9a5995e7deb1808bc" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.744569 4961 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e1c02fe503f3dd6dea2e13339cb024dd72f437646290a4a9a5995e7deb1808bc"} err="failed to get container status \"e1c02fe503f3dd6dea2e13339cb024dd72f437646290a4a9a5995e7deb1808bc\": rpc error: code = NotFound desc = could not find container \"e1c02fe503f3dd6dea2e13339cb024dd72f437646290a4a9a5995e7deb1808bc\": container with ID starting with e1c02fe503f3dd6dea2e13339cb024dd72f437646290a4a9a5995e7deb1808bc not found: ID does not exist" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.744687 4961 scope.go:117] "RemoveContainer" containerID="eaaa3763085652778605d1c770f69efc257c579aca3c2d1e3ef7696cd35dfeed" Jan 20 11:07:38 crc kubenswrapper[4961]: E0120 11:07:38.746566 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaaa3763085652778605d1c770f69efc257c579aca3c2d1e3ef7696cd35dfeed\": container with ID starting with eaaa3763085652778605d1c770f69efc257c579aca3c2d1e3ef7696cd35dfeed not found: ID does not exist" containerID="eaaa3763085652778605d1c770f69efc257c579aca3c2d1e3ef7696cd35dfeed" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.746621 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaaa3763085652778605d1c770f69efc257c579aca3c2d1e3ef7696cd35dfeed"} err="failed to get container status \"eaaa3763085652778605d1c770f69efc257c579aca3c2d1e3ef7696cd35dfeed\": rpc error: code = NotFound desc = could not find container \"eaaa3763085652778605d1c770f69efc257c579aca3c2d1e3ef7696cd35dfeed\": container with ID starting with eaaa3763085652778605d1c770f69efc257c579aca3c2d1e3ef7696cd35dfeed not found: ID does not exist" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.746705 4961 scope.go:117] "RemoveContainer" containerID="dcb0ae588665ea4664fa21aec906b834140b5698b935a21193ed9be87e8018ea" Jan 20 11:07:38 crc kubenswrapper[4961]: E0120 11:07:38.747163 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcb0ae588665ea4664fa21aec906b834140b5698b935a21193ed9be87e8018ea\": container with ID starting with dcb0ae588665ea4664fa21aec906b834140b5698b935a21193ed9be87e8018ea not found: ID does not exist" containerID="dcb0ae588665ea4664fa21aec906b834140b5698b935a21193ed9be87e8018ea" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.747194 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcb0ae588665ea4664fa21aec906b834140b5698b935a21193ed9be87e8018ea"} err="failed to get container status \"dcb0ae588665ea4664fa21aec906b834140b5698b935a21193ed9be87e8018ea\": rpc error: code = NotFound desc = could not find container \"dcb0ae588665ea4664fa21aec906b834140b5698b935a21193ed9be87e8018ea\": container with ID starting with dcb0ae588665ea4664fa21aec906b834140b5698b935a21193ed9be87e8018ea not found: ID does not exist" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.747215 4961 scope.go:117] "RemoveContainer" containerID="c34cc118527596dbc2be1b8a8fe04864e553e30873c8f0c0bed91b3b29ef4953" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.802474 4961 scope.go:117] "RemoveContainer" containerID="c34cc118527596dbc2be1b8a8fe04864e553e30873c8f0c0bed91b3b29ef4953" Jan 20 11:07:38 crc kubenswrapper[4961]: E0120 11:07:38.803289 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c34cc118527596dbc2be1b8a8fe04864e553e30873c8f0c0bed91b3b29ef4953\": 
container with ID starting with c34cc118527596dbc2be1b8a8fe04864e553e30873c8f0c0bed91b3b29ef4953 not found: ID does not exist" containerID="c34cc118527596dbc2be1b8a8fe04864e553e30873c8f0c0bed91b3b29ef4953" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.803327 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c34cc118527596dbc2be1b8a8fe04864e553e30873c8f0c0bed91b3b29ef4953"} err="failed to get container status \"c34cc118527596dbc2be1b8a8fe04864e553e30873c8f0c0bed91b3b29ef4953\": rpc error: code = NotFound desc = could not find container \"c34cc118527596dbc2be1b8a8fe04864e553e30873c8f0c0bed91b3b29ef4953\": container with ID starting with c34cc118527596dbc2be1b8a8fe04864e553e30873c8f0c0bed91b3b29ef4953 not found: ID does not exist" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.803355 4961 scope.go:117] "RemoveContainer" containerID="a30e9c9dd5df963c853613fc5a0081bfc456cfe12d50f6eedb41cfdd516fec6d" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.824245 4961 scope.go:117] "RemoveContainer" containerID="74a5a48c651ba32371b67994c5dafd4c4b050b89c5bbd1d7435f2ff602a51304" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.846490 4961 scope.go:117] "RemoveContainer" containerID="d273b16dd07814f8a8f81b9cf8c689f0bc26802ab7471922fb4a114a0d40822a" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.865629 4961 scope.go:117] "RemoveContainer" containerID="a30e9c9dd5df963c853613fc5a0081bfc456cfe12d50f6eedb41cfdd516fec6d" Jan 20 11:07:38 crc kubenswrapper[4961]: E0120 11:07:38.866474 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a30e9c9dd5df963c853613fc5a0081bfc456cfe12d50f6eedb41cfdd516fec6d\": container with ID starting with a30e9c9dd5df963c853613fc5a0081bfc456cfe12d50f6eedb41cfdd516fec6d not found: ID does not exist" containerID="a30e9c9dd5df963c853613fc5a0081bfc456cfe12d50f6eedb41cfdd516fec6d" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.866523 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a30e9c9dd5df963c853613fc5a0081bfc456cfe12d50f6eedb41cfdd516fec6d"} err="failed to get container status \"a30e9c9dd5df963c853613fc5a0081bfc456cfe12d50f6eedb41cfdd516fec6d\": rpc error: code = NotFound desc = could not find container \"a30e9c9dd5df963c853613fc5a0081bfc456cfe12d50f6eedb41cfdd516fec6d\": container with ID starting with a30e9c9dd5df963c853613fc5a0081bfc456cfe12d50f6eedb41cfdd516fec6d not found: ID does not exist" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.866554 4961 scope.go:117] "RemoveContainer" containerID="74a5a48c651ba32371b67994c5dafd4c4b050b89c5bbd1d7435f2ff602a51304" Jan 20 11:07:38 crc kubenswrapper[4961]: E0120 11:07:38.866945 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74a5a48c651ba32371b67994c5dafd4c4b050b89c5bbd1d7435f2ff602a51304\": container with ID starting with 74a5a48c651ba32371b67994c5dafd4c4b050b89c5bbd1d7435f2ff602a51304 not found: ID does not exist" containerID="74a5a48c651ba32371b67994c5dafd4c4b050b89c5bbd1d7435f2ff602a51304" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.867025 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a5a48c651ba32371b67994c5dafd4c4b050b89c5bbd1d7435f2ff602a51304"} err="failed to get container status \"74a5a48c651ba32371b67994c5dafd4c4b050b89c5bbd1d7435f2ff602a51304\": 
rpc error: code = NotFound desc = could not find container \"74a5a48c651ba32371b67994c5dafd4c4b050b89c5bbd1d7435f2ff602a51304\": container with ID starting with 74a5a48c651ba32371b67994c5dafd4c4b050b89c5bbd1d7435f2ff602a51304 not found: ID does not exist" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.867103 4961 scope.go:117] "RemoveContainer" containerID="d273b16dd07814f8a8f81b9cf8c689f0bc26802ab7471922fb4a114a0d40822a" Jan 20 11:07:38 crc kubenswrapper[4961]: E0120 11:07:38.867449 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d273b16dd07814f8a8f81b9cf8c689f0bc26802ab7471922fb4a114a0d40822a\": container with ID starting with d273b16dd07814f8a8f81b9cf8c689f0bc26802ab7471922fb4a114a0d40822a not found: ID does not exist" containerID="d273b16dd07814f8a8f81b9cf8c689f0bc26802ab7471922fb4a114a0d40822a" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.867476 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d273b16dd07814f8a8f81b9cf8c689f0bc26802ab7471922fb4a114a0d40822a"} err="failed to get container status \"d273b16dd07814f8a8f81b9cf8c689f0bc26802ab7471922fb4a114a0d40822a\": rpc error: code = NotFound desc = could not find container \"d273b16dd07814f8a8f81b9cf8c689f0bc26802ab7471922fb4a114a0d40822a\": container with ID starting with d273b16dd07814f8a8f81b9cf8c689f0bc26802ab7471922fb4a114a0d40822a not found: ID does not exist" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.867492 4961 scope.go:117] "RemoveContainer" containerID="52619eb6a5e5d75703b016d88aab8b70c37ae836cfac1c2b88dc58d7e8dfa5ef" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.883456 4961 scope.go:117] "RemoveContainer" containerID="0fd7392a69ec0622a61117714c5ce95be26e0a40cb8dbf1d3c5cf66315c0aad3" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.902682 4961 scope.go:117] "RemoveContainer" containerID="cfa126d8eee7b0123aaca3bc28a38d5ae9dcebf829e885c71d533b8a71f810c0" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.926497 4961 scope.go:117] "RemoveContainer" containerID="52619eb6a5e5d75703b016d88aab8b70c37ae836cfac1c2b88dc58d7e8dfa5ef" Jan 20 11:07:38 crc kubenswrapper[4961]: E0120 11:07:38.927413 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52619eb6a5e5d75703b016d88aab8b70c37ae836cfac1c2b88dc58d7e8dfa5ef\": container with ID starting with 52619eb6a5e5d75703b016d88aab8b70c37ae836cfac1c2b88dc58d7e8dfa5ef not found: ID does not exist" containerID="52619eb6a5e5d75703b016d88aab8b70c37ae836cfac1c2b88dc58d7e8dfa5ef" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.927534 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52619eb6a5e5d75703b016d88aab8b70c37ae836cfac1c2b88dc58d7e8dfa5ef"} err="failed to get container status \"52619eb6a5e5d75703b016d88aab8b70c37ae836cfac1c2b88dc58d7e8dfa5ef\": rpc error: code = NotFound desc = could not find container \"52619eb6a5e5d75703b016d88aab8b70c37ae836cfac1c2b88dc58d7e8dfa5ef\": container with ID starting with 52619eb6a5e5d75703b016d88aab8b70c37ae836cfac1c2b88dc58d7e8dfa5ef not found: ID does not exist" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.927635 4961 scope.go:117] "RemoveContainer" containerID="0fd7392a69ec0622a61117714c5ce95be26e0a40cb8dbf1d3c5cf66315c0aad3" Jan 20 11:07:38 crc kubenswrapper[4961]: E0120 11:07:38.928241 4961 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"0fd7392a69ec0622a61117714c5ce95be26e0a40cb8dbf1d3c5cf66315c0aad3\": container with ID starting with 0fd7392a69ec0622a61117714c5ce95be26e0a40cb8dbf1d3c5cf66315c0aad3 not found: ID does not exist" containerID="0fd7392a69ec0622a61117714c5ce95be26e0a40cb8dbf1d3c5cf66315c0aad3" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.928276 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fd7392a69ec0622a61117714c5ce95be26e0a40cb8dbf1d3c5cf66315c0aad3"} err="failed to get container status \"0fd7392a69ec0622a61117714c5ce95be26e0a40cb8dbf1d3c5cf66315c0aad3\": rpc error: code = NotFound desc = could not find container \"0fd7392a69ec0622a61117714c5ce95be26e0a40cb8dbf1d3c5cf66315c0aad3\": container with ID starting with 0fd7392a69ec0622a61117714c5ce95be26e0a40cb8dbf1d3c5cf66315c0aad3 not found: ID does not exist" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.928308 4961 scope.go:117] "RemoveContainer" containerID="cfa126d8eee7b0123aaca3bc28a38d5ae9dcebf829e885c71d533b8a71f810c0" Jan 20 11:07:38 crc kubenswrapper[4961]: E0120 11:07:38.928683 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa126d8eee7b0123aaca3bc28a38d5ae9dcebf829e885c71d533b8a71f810c0\": container with ID starting with cfa126d8eee7b0123aaca3bc28a38d5ae9dcebf829e885c71d533b8a71f810c0 not found: ID does not exist" containerID="cfa126d8eee7b0123aaca3bc28a38d5ae9dcebf829e885c71d533b8a71f810c0" Jan 20 11:07:38 crc kubenswrapper[4961]: I0120 11:07:38.928741 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa126d8eee7b0123aaca3bc28a38d5ae9dcebf829e885c71d533b8a71f810c0"} err="failed to get container status \"cfa126d8eee7b0123aaca3bc28a38d5ae9dcebf829e885c71d533b8a71f810c0\": rpc error: code = NotFound desc = could not find container \"cfa126d8eee7b0123aaca3bc28a38d5ae9dcebf829e885c71d533b8a71f810c0\": container with ID starting with cfa126d8eee7b0123aaca3bc28a38d5ae9dcebf829e885c71d533b8a71f810c0 not found: ID does not exist" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.545124 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef" path="/var/lib/kubelet/pods/03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef/volumes" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.546271 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="221c46d0-ccdb-4e6a-a143-04c3bce55711" path="/var/lib/kubelet/pods/221c46d0-ccdb-4e6a-a143-04c3bce55711/volumes" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.546844 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3449c15e-8212-40ed-85f5-37a0f79fd9e4" path="/var/lib/kubelet/pods/3449c15e-8212-40ed-85f5-37a0f79fd9e4/volumes" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.548168 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4155767c-ce93-427a-9a44-d02d9fa3ac62" path="/var/lib/kubelet/pods/4155767c-ce93-427a-9a44-d02d9fa3ac62/volumes" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.548886 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52943ef2-6fee-4910-8dd2-3723b3575824" path="/var/lib/kubelet/pods/52943ef2-6fee-4910-8dd2-3723b3575824/volumes" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611356 4961 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-j5t5v"] Jan 20 11:07:39 crc kubenswrapper[4961]: E0120 11:07:39.611585 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4155767c-ce93-427a-9a44-d02d9fa3ac62" containerName="extract-utilities" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611601 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="4155767c-ce93-427a-9a44-d02d9fa3ac62" containerName="extract-utilities" Jan 20 11:07:39 crc kubenswrapper[4961]: E0120 11:07:39.611611 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52943ef2-6fee-4910-8dd2-3723b3575824" containerName="registry-server" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611618 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="52943ef2-6fee-4910-8dd2-3723b3575824" containerName="registry-server" Jan 20 11:07:39 crc kubenswrapper[4961]: E0120 11:07:39.611633 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3449c15e-8212-40ed-85f5-37a0f79fd9e4" containerName="registry-server" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611640 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="3449c15e-8212-40ed-85f5-37a0f79fd9e4" containerName="registry-server" Jan 20 11:07:39 crc kubenswrapper[4961]: E0120 11:07:39.611648 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef" containerName="extract-content" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611655 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef" containerName="extract-content" Jan 20 11:07:39 crc kubenswrapper[4961]: E0120 11:07:39.611662 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52943ef2-6fee-4910-8dd2-3723b3575824" containerName="extract-utilities" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611669 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="52943ef2-6fee-4910-8dd2-3723b3575824" containerName="extract-utilities" Jan 20 11:07:39 crc kubenswrapper[4961]: E0120 11:07:39.611678 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="221c46d0-ccdb-4e6a-a143-04c3bce55711" containerName="marketplace-operator" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611685 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="221c46d0-ccdb-4e6a-a143-04c3bce55711" containerName="marketplace-operator" Jan 20 11:07:39 crc kubenswrapper[4961]: E0120 11:07:39.611696 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52943ef2-6fee-4910-8dd2-3723b3575824" containerName="extract-content" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611703 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="52943ef2-6fee-4910-8dd2-3723b3575824" containerName="extract-content" Jan 20 11:07:39 crc kubenswrapper[4961]: E0120 11:07:39.611716 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef" containerName="extract-utilities" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611723 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef" containerName="extract-utilities" Jan 20 11:07:39 crc kubenswrapper[4961]: E0120 11:07:39.611752 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef" containerName="registry-server" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611760 4961 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef" containerName="registry-server" Jan 20 11:07:39 crc kubenswrapper[4961]: E0120 11:07:39.611772 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4155767c-ce93-427a-9a44-d02d9fa3ac62" containerName="extract-content" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611781 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="4155767c-ce93-427a-9a44-d02d9fa3ac62" containerName="extract-content" Jan 20 11:07:39 crc kubenswrapper[4961]: E0120 11:07:39.611790 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4155767c-ce93-427a-9a44-d02d9fa3ac62" containerName="registry-server" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611799 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="4155767c-ce93-427a-9a44-d02d9fa3ac62" containerName="registry-server" Jan 20 11:07:39 crc kubenswrapper[4961]: E0120 11:07:39.611809 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3449c15e-8212-40ed-85f5-37a0f79fd9e4" containerName="extract-utilities" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611816 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="3449c15e-8212-40ed-85f5-37a0f79fd9e4" containerName="extract-utilities" Jan 20 11:07:39 crc kubenswrapper[4961]: E0120 11:07:39.611826 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3449c15e-8212-40ed-85f5-37a0f79fd9e4" containerName="extract-content" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611834 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="3449c15e-8212-40ed-85f5-37a0f79fd9e4" containerName="extract-content" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611940 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="221c46d0-ccdb-4e6a-a143-04c3bce55711" containerName="marketplace-operator" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611958 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="3449c15e-8212-40ed-85f5-37a0f79fd9e4" containerName="registry-server" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611967 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="03eb8291-8e44-46c5-a4ae-a9e4cc8f39ef" containerName="registry-server" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611976 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="52943ef2-6fee-4910-8dd2-3723b3575824" containerName="registry-server" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.611986 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="4155767c-ce93-427a-9a44-d02d9fa3ac62" containerName="registry-server" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.612786 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5t5v" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.614403 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlpxz\" (UniqueName: \"kubernetes.io/projected/3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c-kube-api-access-rlpxz\") pod \"redhat-marketplace-j5t5v\" (UID: \"3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c\") " pod="openshift-marketplace/redhat-marketplace-j5t5v" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.614483 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c-utilities\") pod \"redhat-marketplace-j5t5v\" (UID: \"3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c\") " pod="openshift-marketplace/redhat-marketplace-j5t5v" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.614548 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c-catalog-content\") pod \"redhat-marketplace-j5t5v\" (UID: \"3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c\") " pod="openshift-marketplace/redhat-marketplace-j5t5v" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.615526 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.617537 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.617598 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.621251 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5t5v"] Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.715883 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c-catalog-content\") pod \"redhat-marketplace-j5t5v\" (UID: \"3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c\") " pod="openshift-marketplace/redhat-marketplace-j5t5v" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.716156 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlpxz\" (UniqueName: \"kubernetes.io/projected/3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c-kube-api-access-rlpxz\") pod \"redhat-marketplace-j5t5v\" (UID: \"3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c\") " pod="openshift-marketplace/redhat-marketplace-j5t5v" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.716239 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c-utilities\") pod \"redhat-marketplace-j5t5v\" (UID: \"3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c\") " pod="openshift-marketplace/redhat-marketplace-j5t5v" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.716669 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c-catalog-content\") pod \"redhat-marketplace-j5t5v\" (UID: 
\"3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c\") " pod="openshift-marketplace/redhat-marketplace-j5t5v" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.716705 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c-utilities\") pod \"redhat-marketplace-j5t5v\" (UID: \"3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c\") " pod="openshift-marketplace/redhat-marketplace-j5t5v" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.735232 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlpxz\" (UniqueName: \"kubernetes.io/projected/3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c-kube-api-access-rlpxz\") pod \"redhat-marketplace-j5t5v\" (UID: \"3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c\") " pod="openshift-marketplace/redhat-marketplace-j5t5v" Jan 20 11:07:39 crc kubenswrapper[4961]: I0120 11:07:39.927027 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5t5v" Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.360336 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5t5v"] Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.621716 4961 generic.go:334] "Generic (PLEG): container finished" podID="3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c" containerID="4e44d57676ad6ab4d4912f965ae873baf03db0fa67a32f80264faed7a9315725" exitCode=0 Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.621817 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5t5v" event={"ID":"3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c","Type":"ContainerDied","Data":"4e44d57676ad6ab4d4912f965ae873baf03db0fa67a32f80264faed7a9315725"} Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.622132 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5t5v" event={"ID":"3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c","Type":"ContainerStarted","Data":"c592b07150c5e460d8c8ac6ab30ce37bed445c578a0dfadc16c0b3efc6949780"} Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.697361 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.804602 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6wb7h"] Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.805552 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6wb7h" Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.807415 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.814606 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6wb7h"] Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.829972 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07da8a3f-c7d1-49a2-abe8-f05504fc3887-utilities\") pod \"redhat-operators-6wb7h\" (UID: \"07da8a3f-c7d1-49a2-abe8-f05504fc3887\") " pod="openshift-marketplace/redhat-operators-6wb7h" Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.830030 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07da8a3f-c7d1-49a2-abe8-f05504fc3887-catalog-content\") pod \"redhat-operators-6wb7h\" (UID: \"07da8a3f-c7d1-49a2-abe8-f05504fc3887\") " pod="openshift-marketplace/redhat-operators-6wb7h" Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.830054 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2pkt\" (UniqueName: \"kubernetes.io/projected/07da8a3f-c7d1-49a2-abe8-f05504fc3887-kube-api-access-n2pkt\") pod \"redhat-operators-6wb7h\" (UID: \"07da8a3f-c7d1-49a2-abe8-f05504fc3887\") " pod="openshift-marketplace/redhat-operators-6wb7h" Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.931181 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07da8a3f-c7d1-49a2-abe8-f05504fc3887-utilities\") pod \"redhat-operators-6wb7h\" (UID: \"07da8a3f-c7d1-49a2-abe8-f05504fc3887\") " pod="openshift-marketplace/redhat-operators-6wb7h" Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.931593 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07da8a3f-c7d1-49a2-abe8-f05504fc3887-catalog-content\") pod \"redhat-operators-6wb7h\" (UID: \"07da8a3f-c7d1-49a2-abe8-f05504fc3887\") " pod="openshift-marketplace/redhat-operators-6wb7h" Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.931627 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2pkt\" (UniqueName: \"kubernetes.io/projected/07da8a3f-c7d1-49a2-abe8-f05504fc3887-kube-api-access-n2pkt\") pod \"redhat-operators-6wb7h\" (UID: \"07da8a3f-c7d1-49a2-abe8-f05504fc3887\") " pod="openshift-marketplace/redhat-operators-6wb7h" Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.931642 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07da8a3f-c7d1-49a2-abe8-f05504fc3887-utilities\") pod \"redhat-operators-6wb7h\" (UID: \"07da8a3f-c7d1-49a2-abe8-f05504fc3887\") " pod="openshift-marketplace/redhat-operators-6wb7h" Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.932107 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07da8a3f-c7d1-49a2-abe8-f05504fc3887-catalog-content\") pod \"redhat-operators-6wb7h\" (UID: \"07da8a3f-c7d1-49a2-abe8-f05504fc3887\") " 
pod="openshift-marketplace/redhat-operators-6wb7h" Jan 20 11:07:40 crc kubenswrapper[4961]: I0120 11:07:40.948844 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2pkt\" (UniqueName: \"kubernetes.io/projected/07da8a3f-c7d1-49a2-abe8-f05504fc3887-kube-api-access-n2pkt\") pod \"redhat-operators-6wb7h\" (UID: \"07da8a3f-c7d1-49a2-abe8-f05504fc3887\") " pod="openshift-marketplace/redhat-operators-6wb7h" Jan 20 11:07:41 crc kubenswrapper[4961]: I0120 11:07:41.129657 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6wb7h" Jan 20 11:07:41 crc kubenswrapper[4961]: I0120 11:07:41.558246 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6wb7h"] Jan 20 11:07:41 crc kubenswrapper[4961]: W0120 11:07:41.568008 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07da8a3f_c7d1_49a2_abe8_f05504fc3887.slice/crio-fcb138d3ef8e89de9a02e9fce914ef12493444ea781868727f5b5f77cf760144 WatchSource:0}: Error finding container fcb138d3ef8e89de9a02e9fce914ef12493444ea781868727f5b5f77cf760144: Status 404 returned error can't find the container with id fcb138d3ef8e89de9a02e9fce914ef12493444ea781868727f5b5f77cf760144 Jan 20 11:07:41 crc kubenswrapper[4961]: I0120 11:07:41.628354 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wb7h" event={"ID":"07da8a3f-c7d1-49a2-abe8-f05504fc3887","Type":"ContainerStarted","Data":"fcb138d3ef8e89de9a02e9fce914ef12493444ea781868727f5b5f77cf760144"} Jan 20 11:07:41 crc kubenswrapper[4961]: I0120 11:07:41.631205 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5t5v" event={"ID":"3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c","Type":"ContainerStarted","Data":"bb50624bb2b804df6c5365f20c17ec132a16a6896e4d6f9d1ac33e6d991da722"} Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.397541 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-97rvg"] Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.399344 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-97rvg" Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.403620 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.408530 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-97rvg"] Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.550797 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0917618-92fa-4b49-bc76-e95d0f5661e5-catalog-content\") pod \"certified-operators-97rvg\" (UID: \"a0917618-92fa-4b49-bc76-e95d0f5661e5\") " pod="openshift-marketplace/certified-operators-97rvg" Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.551096 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6j8j\" (UniqueName: \"kubernetes.io/projected/a0917618-92fa-4b49-bc76-e95d0f5661e5-kube-api-access-n6j8j\") pod \"certified-operators-97rvg\" (UID: \"a0917618-92fa-4b49-bc76-e95d0f5661e5\") " pod="openshift-marketplace/certified-operators-97rvg" Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.551195 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0917618-92fa-4b49-bc76-e95d0f5661e5-utilities\") pod \"certified-operators-97rvg\" (UID: \"a0917618-92fa-4b49-bc76-e95d0f5661e5\") " pod="openshift-marketplace/certified-operators-97rvg" Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.637747 4961 generic.go:334] "Generic (PLEG): container finished" podID="3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c" containerID="bb50624bb2b804df6c5365f20c17ec132a16a6896e4d6f9d1ac33e6d991da722" exitCode=0 Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.637780 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5t5v" event={"ID":"3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c","Type":"ContainerDied","Data":"bb50624bb2b804df6c5365f20c17ec132a16a6896e4d6f9d1ac33e6d991da722"} Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.639142 4961 generic.go:334] "Generic (PLEG): container finished" podID="07da8a3f-c7d1-49a2-abe8-f05504fc3887" containerID="16da96f4b0da8d9193c843fc3ad318ff42ec3d6c3e392d4c6caafe2140543c3f" exitCode=0 Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.639166 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wb7h" event={"ID":"07da8a3f-c7d1-49a2-abe8-f05504fc3887","Type":"ContainerDied","Data":"16da96f4b0da8d9193c843fc3ad318ff42ec3d6c3e392d4c6caafe2140543c3f"} Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.652129 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0917618-92fa-4b49-bc76-e95d0f5661e5-utilities\") pod \"certified-operators-97rvg\" (UID: \"a0917618-92fa-4b49-bc76-e95d0f5661e5\") " pod="openshift-marketplace/certified-operators-97rvg" Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.652248 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0917618-92fa-4b49-bc76-e95d0f5661e5-catalog-content\") pod \"certified-operators-97rvg\" (UID: 
\"a0917618-92fa-4b49-bc76-e95d0f5661e5\") " pod="openshift-marketplace/certified-operators-97rvg" Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.652280 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6j8j\" (UniqueName: \"kubernetes.io/projected/a0917618-92fa-4b49-bc76-e95d0f5661e5-kube-api-access-n6j8j\") pod \"certified-operators-97rvg\" (UID: \"a0917618-92fa-4b49-bc76-e95d0f5661e5\") " pod="openshift-marketplace/certified-operators-97rvg" Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.652713 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0917618-92fa-4b49-bc76-e95d0f5661e5-utilities\") pod \"certified-operators-97rvg\" (UID: \"a0917618-92fa-4b49-bc76-e95d0f5661e5\") " pod="openshift-marketplace/certified-operators-97rvg" Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.652773 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0917618-92fa-4b49-bc76-e95d0f5661e5-catalog-content\") pod \"certified-operators-97rvg\" (UID: \"a0917618-92fa-4b49-bc76-e95d0f5661e5\") " pod="openshift-marketplace/certified-operators-97rvg" Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.681187 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6j8j\" (UniqueName: \"kubernetes.io/projected/a0917618-92fa-4b49-bc76-e95d0f5661e5-kube-api-access-n6j8j\") pod \"certified-operators-97rvg\" (UID: \"a0917618-92fa-4b49-bc76-e95d0f5661e5\") " pod="openshift-marketplace/certified-operators-97rvg" Jan 20 11:07:42 crc kubenswrapper[4961]: I0120 11:07:42.717957 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-97rvg" Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.093759 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-97rvg"] Jan 20 11:07:43 crc kubenswrapper[4961]: W0120 11:07:43.097747 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0917618_92fa_4b49_bc76_e95d0f5661e5.slice/crio-12949e920391ec6b4e89a05e01ff5e4ec4b4f35382d03c522b172b549f333926 WatchSource:0}: Error finding container 12949e920391ec6b4e89a05e01ff5e4ec4b4f35382d03c522b172b549f333926: Status 404 returned error can't find the container with id 12949e920391ec6b4e89a05e01ff5e4ec4b4f35382d03c522b172b549f333926 Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.553366 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.557037 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.602987 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qtbnk"] Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.603899 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qtbnk" Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.605617 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.615614 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qtbnk"] Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.649165 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5t5v" event={"ID":"3273c0de-cbe3-45f8-9a3e-fef0b18e4a5c","Type":"ContainerStarted","Data":"a0ecc1dd494d35eed9418c28f15057d88aad35b398e43e071812b395bd046d22"} Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.650730 4961 generic.go:334] "Generic (PLEG): container finished" podID="a0917618-92fa-4b49-bc76-e95d0f5661e5" containerID="6e8f3ee3363e59bd742d69e2d4204c1adc429ee3668c12f311ab69225131d5f4" exitCode=0 Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.650981 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97rvg" event={"ID":"a0917618-92fa-4b49-bc76-e95d0f5661e5","Type":"ContainerDied","Data":"6e8f3ee3363e59bd742d69e2d4204c1adc429ee3668c12f311ab69225131d5f4"} Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.651021 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97rvg" event={"ID":"a0917618-92fa-4b49-bc76-e95d0f5661e5","Type":"ContainerStarted","Data":"12949e920391ec6b4e89a05e01ff5e4ec4b4f35382d03c522b172b549f333926"} Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.668502 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j5t5v" podStartSLOduration=1.9474144629999999 podStartE2EDuration="4.668484513s" podCreationTimestamp="2026-01-20 11:07:39 +0000 UTC" firstStartedPulling="2026-01-20 11:07:40.623691961 +0000 UTC m=+213.408191832" lastFinishedPulling="2026-01-20 11:07:43.344762011 +0000 UTC m=+216.129261882" observedRunningTime="2026-01-20 11:07:43.666698119 +0000 UTC m=+216.451197990" watchObservedRunningTime="2026-01-20 11:07:43.668484513 +0000 UTC m=+216.452984384" Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.768807 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1882c51-8bab-473c-bd83-fe8427d32470-catalog-content\") pod \"community-operators-qtbnk\" (UID: \"d1882c51-8bab-473c-bd83-fe8427d32470\") " pod="openshift-marketplace/community-operators-qtbnk" Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.768863 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgxzq\" (UniqueName: \"kubernetes.io/projected/d1882c51-8bab-473c-bd83-fe8427d32470-kube-api-access-xgxzq\") pod \"community-operators-qtbnk\" (UID: \"d1882c51-8bab-473c-bd83-fe8427d32470\") " pod="openshift-marketplace/community-operators-qtbnk" Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.768915 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1882c51-8bab-473c-bd83-fe8427d32470-utilities\") pod \"community-operators-qtbnk\" (UID: \"d1882c51-8bab-473c-bd83-fe8427d32470\") " pod="openshift-marketplace/community-operators-qtbnk" 
Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.870574 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1882c51-8bab-473c-bd83-fe8427d32470-catalog-content\") pod \"community-operators-qtbnk\" (UID: \"d1882c51-8bab-473c-bd83-fe8427d32470\") " pod="openshift-marketplace/community-operators-qtbnk" Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.870653 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgxzq\" (UniqueName: \"kubernetes.io/projected/d1882c51-8bab-473c-bd83-fe8427d32470-kube-api-access-xgxzq\") pod \"community-operators-qtbnk\" (UID: \"d1882c51-8bab-473c-bd83-fe8427d32470\") " pod="openshift-marketplace/community-operators-qtbnk" Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.870703 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1882c51-8bab-473c-bd83-fe8427d32470-utilities\") pod \"community-operators-qtbnk\" (UID: \"d1882c51-8bab-473c-bd83-fe8427d32470\") " pod="openshift-marketplace/community-operators-qtbnk" Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.871369 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1882c51-8bab-473c-bd83-fe8427d32470-catalog-content\") pod \"community-operators-qtbnk\" (UID: \"d1882c51-8bab-473c-bd83-fe8427d32470\") " pod="openshift-marketplace/community-operators-qtbnk" Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.871432 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1882c51-8bab-473c-bd83-fe8427d32470-utilities\") pod \"community-operators-qtbnk\" (UID: \"d1882c51-8bab-473c-bd83-fe8427d32470\") " pod="openshift-marketplace/community-operators-qtbnk" Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.897451 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgxzq\" (UniqueName: \"kubernetes.io/projected/d1882c51-8bab-473c-bd83-fe8427d32470-kube-api-access-xgxzq\") pod \"community-operators-qtbnk\" (UID: \"d1882c51-8bab-473c-bd83-fe8427d32470\") " pod="openshift-marketplace/community-operators-qtbnk" Jan 20 11:07:43 crc kubenswrapper[4961]: I0120 11:07:43.919110 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qtbnk" Jan 20 11:07:44 crc kubenswrapper[4961]: I0120 11:07:44.330474 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qtbnk"] Jan 20 11:07:44 crc kubenswrapper[4961]: I0120 11:07:44.656894 4961 generic.go:334] "Generic (PLEG): container finished" podID="d1882c51-8bab-473c-bd83-fe8427d32470" containerID="3c509626d2e4796f9e74ce08c46672d44ff96218c6c2dad4436ce2d3becf0a4c" exitCode=0 Jan 20 11:07:44 crc kubenswrapper[4961]: I0120 11:07:44.658706 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtbnk" event={"ID":"d1882c51-8bab-473c-bd83-fe8427d32470","Type":"ContainerDied","Data":"3c509626d2e4796f9e74ce08c46672d44ff96218c6c2dad4436ce2d3becf0a4c"} Jan 20 11:07:44 crc kubenswrapper[4961]: I0120 11:07:44.658812 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtbnk" event={"ID":"d1882c51-8bab-473c-bd83-fe8427d32470","Type":"ContainerStarted","Data":"c51da97f333272207b08297677b858f232bbff76656f058b0d041db758def219"} Jan 20 11:07:44 crc kubenswrapper[4961]: I0120 11:07:44.661828 4961 generic.go:334] "Generic (PLEG): container finished" podID="07da8a3f-c7d1-49a2-abe8-f05504fc3887" containerID="c3c77ea1c95ceaa0c693d99e100b42c505e9bdffc825b5b20c3be3e3fbf54830" exitCode=0 Jan 20 11:07:44 crc kubenswrapper[4961]: I0120 11:07:44.661937 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wb7h" event={"ID":"07da8a3f-c7d1-49a2-abe8-f05504fc3887","Type":"ContainerDied","Data":"c3c77ea1c95ceaa0c693d99e100b42c505e9bdffc825b5b20c3be3e3fbf54830"} Jan 20 11:07:45 crc kubenswrapper[4961]: I0120 11:07:45.668258 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wb7h" event={"ID":"07da8a3f-c7d1-49a2-abe8-f05504fc3887","Type":"ContainerStarted","Data":"8ce8ce1f712644ff485723d6c70f4a1c047d6c6d4628f74232363e199fb8a264"} Jan 20 11:07:45 crc kubenswrapper[4961]: I0120 11:07:45.670201 4961 generic.go:334] "Generic (PLEG): container finished" podID="a0917618-92fa-4b49-bc76-e95d0f5661e5" containerID="f48630e18d0217e746289acdbe10500b4da19f78c10e3a5e18a96c35c52fdd83" exitCode=0 Jan 20 11:07:45 crc kubenswrapper[4961]: I0120 11:07:45.670246 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97rvg" event={"ID":"a0917618-92fa-4b49-bc76-e95d0f5661e5","Type":"ContainerDied","Data":"f48630e18d0217e746289acdbe10500b4da19f78c10e3a5e18a96c35c52fdd83"} Jan 20 11:07:45 crc kubenswrapper[4961]: I0120 11:07:45.684161 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6wb7h" podStartSLOduration=2.902805687 podStartE2EDuration="5.684147385s" podCreationTimestamp="2026-01-20 11:07:40 +0000 UTC" firstStartedPulling="2026-01-20 11:07:42.640343618 +0000 UTC m=+215.424843489" lastFinishedPulling="2026-01-20 11:07:45.421685316 +0000 UTC m=+218.206185187" observedRunningTime="2026-01-20 11:07:45.684103464 +0000 UTC m=+218.468603345" watchObservedRunningTime="2026-01-20 11:07:45.684147385 +0000 UTC m=+218.468647256" Jan 20 11:07:46 crc kubenswrapper[4961]: I0120 11:07:46.676651 4961 generic.go:334] "Generic (PLEG): container finished" podID="d1882c51-8bab-473c-bd83-fe8427d32470" containerID="b6ecf12f23a84391b126bb0cedfebefc646d86c88641b5386a8fbfe5a7b48eb5" exitCode=0 Jan 20 11:07:46 crc kubenswrapper[4961]: 
I0120 11:07:46.677692 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtbnk" event={"ID":"d1882c51-8bab-473c-bd83-fe8427d32470","Type":"ContainerDied","Data":"b6ecf12f23a84391b126bb0cedfebefc646d86c88641b5386a8fbfe5a7b48eb5"} Jan 20 11:07:48 crc kubenswrapper[4961]: I0120 11:07:48.538453 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 11:07:48 crc kubenswrapper[4961]: I0120 11:07:48.688970 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97rvg" event={"ID":"a0917618-92fa-4b49-bc76-e95d0f5661e5","Type":"ContainerStarted","Data":"8fb5b010eaf64749c2a8634522378508e5a8ee5fdca6011d8b20c1cfd118e69e"} Jan 20 11:07:49 crc kubenswrapper[4961]: I0120 11:07:49.927304 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j5t5v" Jan 20 11:07:49 crc kubenswrapper[4961]: I0120 11:07:49.927372 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j5t5v" Jan 20 11:07:49 crc kubenswrapper[4961]: I0120 11:07:49.990471 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j5t5v" Jan 20 11:07:50 crc kubenswrapper[4961]: I0120 11:07:50.747259 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j5t5v" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.130437 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6wb7h" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.131574 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6wb7h" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.256966 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6wpc4"] Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.257806 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6wpc4" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.261712 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.262396 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.269115 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.273178 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6wpc4"] Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.297837 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fsp5c"] Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.298729 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.305417 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6446dffbbd-kz4d2"] Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.305643 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" podUID="b6aebb4a-5794-41ec-9084-2a0fc9103b58" containerName="controller-manager" containerID="cri-o://ebd2de122e6aaa17cd8a0c213b7152b98e0a155ee6742e140e493f8519a40869" gracePeriod=30 Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.345190 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fsp5c"] Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.371617 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr"] Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.371857 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" podUID="4c4cb271-f0ee-4997-8e87-095f48d6658f" containerName="route-controller-manager" containerID="cri-o://0b91837dedd88d57c7d0ce088c868d0bf5cc1fe1f034b17d4d31ae2a7e1c7fcf" gracePeriod=30 Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.379746 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c72d062d-5afc-4b02-996e-0fd02b0ad9d1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6wpc4\" (UID: \"c72d062d-5afc-4b02-996e-0fd02b0ad9d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpc4" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.380145 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvr66\" (UniqueName: \"kubernetes.io/projected/c72d062d-5afc-4b02-996e-0fd02b0ad9d1-kube-api-access-tvr66\") pod \"marketplace-operator-79b997595-6wpc4\" (UID: \"c72d062d-5afc-4b02-996e-0fd02b0ad9d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpc4" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.380220 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c72d062d-5afc-4b02-996e-0fd02b0ad9d1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6wpc4\" (UID: \"c72d062d-5afc-4b02-996e-0fd02b0ad9d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpc4" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.481472 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2149a838-06ff-41e7-a655-2a6b51973caf-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.481531 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2149a838-06ff-41e7-a655-2a6b51973caf-bound-sa-token\") pod 
\"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.481617 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2149a838-06ff-41e7-a655-2a6b51973caf-registry-tls\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.481666 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.481750 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvr66\" (UniqueName: \"kubernetes.io/projected/c72d062d-5afc-4b02-996e-0fd02b0ad9d1-kube-api-access-tvr66\") pod \"marketplace-operator-79b997595-6wpc4\" (UID: \"c72d062d-5afc-4b02-996e-0fd02b0ad9d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpc4" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.481790 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd2ts\" (UniqueName: \"kubernetes.io/projected/2149a838-06ff-41e7-a655-2a6b51973caf-kube-api-access-vd2ts\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.481880 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c72d062d-5afc-4b02-996e-0fd02b0ad9d1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6wpc4\" (UID: \"c72d062d-5afc-4b02-996e-0fd02b0ad9d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpc4" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.481901 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2149a838-06ff-41e7-a655-2a6b51973caf-trusted-ca\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.481919 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2149a838-06ff-41e7-a655-2a6b51973caf-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.481954 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c72d062d-5afc-4b02-996e-0fd02b0ad9d1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6wpc4\" (UID: 
\"c72d062d-5afc-4b02-996e-0fd02b0ad9d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpc4" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.481995 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2149a838-06ff-41e7-a655-2a6b51973caf-registry-certificates\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.483673 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c72d062d-5afc-4b02-996e-0fd02b0ad9d1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6wpc4\" (UID: \"c72d062d-5afc-4b02-996e-0fd02b0ad9d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpc4" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.493278 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c72d062d-5afc-4b02-996e-0fd02b0ad9d1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6wpc4\" (UID: \"c72d062d-5afc-4b02-996e-0fd02b0ad9d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpc4" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.526668 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvr66\" (UniqueName: \"kubernetes.io/projected/c72d062d-5afc-4b02-996e-0fd02b0ad9d1-kube-api-access-tvr66\") pod \"marketplace-operator-79b997595-6wpc4\" (UID: \"c72d062d-5afc-4b02-996e-0fd02b0ad9d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpc4" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.578488 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6wpc4" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.583315 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd2ts\" (UniqueName: \"kubernetes.io/projected/2149a838-06ff-41e7-a655-2a6b51973caf-kube-api-access-vd2ts\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.583380 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2149a838-06ff-41e7-a655-2a6b51973caf-trusted-ca\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.583401 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2149a838-06ff-41e7-a655-2a6b51973caf-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.583433 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2149a838-06ff-41e7-a655-2a6b51973caf-registry-certificates\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.583460 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2149a838-06ff-41e7-a655-2a6b51973caf-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.583482 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2149a838-06ff-41e7-a655-2a6b51973caf-bound-sa-token\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.583506 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2149a838-06ff-41e7-a655-2a6b51973caf-registry-tls\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.584105 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2149a838-06ff-41e7-a655-2a6b51973caf-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.585131 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2149a838-06ff-41e7-a655-2a6b51973caf-trusted-ca\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.585988 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2149a838-06ff-41e7-a655-2a6b51973caf-registry-certificates\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.586317 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2149a838-06ff-41e7-a655-2a6b51973caf-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.588695 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2149a838-06ff-41e7-a655-2a6b51973caf-registry-tls\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.619479 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.622839 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd2ts\" (UniqueName: \"kubernetes.io/projected/2149a838-06ff-41e7-a655-2a6b51973caf-kube-api-access-vd2ts\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.649554 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2149a838-06ff-41e7-a655-2a6b51973caf-bound-sa-token\") pod \"image-registry-66df7c8f76-fsp5c\" (UID: \"2149a838-06ff-41e7-a655-2a6b51973caf\") " pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:51 crc kubenswrapper[4961]: I0120 11:07:51.928741 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:52 crc kubenswrapper[4961]: I0120 11:07:52.064579 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6wpc4"] Jan 20 11:07:52 crc kubenswrapper[4961]: I0120 11:07:52.198883 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6wb7h" podUID="07da8a3f-c7d1-49a2-abe8-f05504fc3887" containerName="registry-server" probeResult="failure" output=< Jan 20 11:07:52 crc kubenswrapper[4961]: timeout: failed to connect service ":50051" within 1s Jan 20 11:07:52 crc kubenswrapper[4961]: > Jan 20 11:07:52 crc kubenswrapper[4961]: I0120 11:07:52.355890 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fsp5c"] Jan 20 11:07:52 crc kubenswrapper[4961]: W0120 11:07:52.359598 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2149a838_06ff_41e7_a655_2a6b51973caf.slice/crio-9ea26da54636f250ffe5ad2c4e40b5291df4f3e389a58a407ae35c70f9e86a82 WatchSource:0}: Error finding container 9ea26da54636f250ffe5ad2c4e40b5291df4f3e389a58a407ae35c70f9e86a82: Status 404 returned error can't find the container with id 9ea26da54636f250ffe5ad2c4e40b5291df4f3e389a58a407ae35c70f9e86a82 Jan 20 11:07:52 crc kubenswrapper[4961]: I0120 11:07:52.708367 4961 generic.go:334] "Generic (PLEG): container finished" podID="b6aebb4a-5794-41ec-9084-2a0fc9103b58" containerID="ebd2de122e6aaa17cd8a0c213b7152b98e0a155ee6742e140e493f8519a40869" exitCode=0 Jan 20 11:07:52 crc kubenswrapper[4961]: I0120 11:07:52.708463 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" event={"ID":"b6aebb4a-5794-41ec-9084-2a0fc9103b58","Type":"ContainerDied","Data":"ebd2de122e6aaa17cd8a0c213b7152b98e0a155ee6742e140e493f8519a40869"} Jan 20 11:07:52 crc kubenswrapper[4961]: I0120 11:07:52.710121 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" event={"ID":"2149a838-06ff-41e7-a655-2a6b51973caf","Type":"ContainerStarted","Data":"9ea26da54636f250ffe5ad2c4e40b5291df4f3e389a58a407ae35c70f9e86a82"} Jan 20 11:07:52 crc kubenswrapper[4961]: I0120 11:07:52.711665 4961 generic.go:334] "Generic (PLEG): container finished" podID="4c4cb271-f0ee-4997-8e87-095f48d6658f" containerID="0b91837dedd88d57c7d0ce088c868d0bf5cc1fe1f034b17d4d31ae2a7e1c7fcf" exitCode=0 Jan 20 11:07:52 crc kubenswrapper[4961]: I0120 11:07:52.711724 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" event={"ID":"4c4cb271-f0ee-4997-8e87-095f48d6658f","Type":"ContainerDied","Data":"0b91837dedd88d57c7d0ce088c868d0bf5cc1fe1f034b17d4d31ae2a7e1c7fcf"} Jan 20 11:07:52 crc kubenswrapper[4961]: I0120 11:07:52.714406 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6wpc4" event={"ID":"c72d062d-5afc-4b02-996e-0fd02b0ad9d1","Type":"ContainerStarted","Data":"a7e934e2d237dfd58042a334979ef9779a01c57d95b5be5b7d35e2d3ce970298"} Jan 20 11:07:52 crc kubenswrapper[4961]: I0120 11:07:52.714442 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6wpc4" 
event={"ID":"c72d062d-5afc-4b02-996e-0fd02b0ad9d1","Type":"ContainerStarted","Data":"b3cc60f37dcf5a5c209a198fb96b06c20883004b1f8ee04ec281d91320b0980c"} Jan 20 11:07:52 crc kubenswrapper[4961]: I0120 11:07:52.718848 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-97rvg" Jan 20 11:07:52 crc kubenswrapper[4961]: I0120 11:07:52.718900 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-97rvg" Jan 20 11:07:52 crc kubenswrapper[4961]: I0120 11:07:52.733670 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-97rvg" podStartSLOduration=7.519189663 podStartE2EDuration="10.733654082s" podCreationTimestamp="2026-01-20 11:07:42 +0000 UTC" firstStartedPulling="2026-01-20 11:07:43.6613834 +0000 UTC m=+216.445883271" lastFinishedPulling="2026-01-20 11:07:46.875847819 +0000 UTC m=+219.660347690" observedRunningTime="2026-01-20 11:07:52.732548895 +0000 UTC m=+225.517048766" watchObservedRunningTime="2026-01-20 11:07:52.733654082 +0000 UTC m=+225.518153953" Jan 20 11:07:52 crc kubenswrapper[4961]: I0120 11:07:52.768101 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-97rvg" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.373007 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.380161 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.473009 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.509690 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-proxy-ca-bundles\") pod \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.509750 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-client-ca\") pod \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.509813 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h4vc\" (UniqueName: \"kubernetes.io/projected/4c4cb271-f0ee-4997-8e87-095f48d6658f-kube-api-access-2h4vc\") pod \"4c4cb271-f0ee-4997-8e87-095f48d6658f\" (UID: \"4c4cb271-f0ee-4997-8e87-095f48d6658f\") " Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.509841 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-config\") pod \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.509860 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-zr9kc\" (UniqueName: \"kubernetes.io/projected/b6aebb4a-5794-41ec-9084-2a0fc9103b58-kube-api-access-zr9kc\") pod \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.509892 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c4cb271-f0ee-4997-8e87-095f48d6658f-client-ca\") pod \"4c4cb271-f0ee-4997-8e87-095f48d6658f\" (UID: \"4c4cb271-f0ee-4997-8e87-095f48d6658f\") " Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.509912 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4cb271-f0ee-4997-8e87-095f48d6658f-config\") pod \"4c4cb271-f0ee-4997-8e87-095f48d6658f\" (UID: \"4c4cb271-f0ee-4997-8e87-095f48d6658f\") " Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.509945 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6aebb4a-5794-41ec-9084-2a0fc9103b58-serving-cert\") pod \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\" (UID: \"b6aebb4a-5794-41ec-9084-2a0fc9103b58\") " Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.509963 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c4cb271-f0ee-4997-8e87-095f48d6658f-serving-cert\") pod \"4c4cb271-f0ee-4997-8e87-095f48d6658f\" (UID: \"4c4cb271-f0ee-4997-8e87-095f48d6658f\") " Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.510513 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b6aebb4a-5794-41ec-9084-2a0fc9103b58" (UID: "b6aebb4a-5794-41ec-9084-2a0fc9103b58"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.510804 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c4cb271-f0ee-4997-8e87-095f48d6658f-client-ca" (OuterVolumeSpecName: "client-ca") pod "4c4cb271-f0ee-4997-8e87-095f48d6658f" (UID: "4c4cb271-f0ee-4997-8e87-095f48d6658f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.510968 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-config" (OuterVolumeSpecName: "config") pod "b6aebb4a-5794-41ec-9084-2a0fc9103b58" (UID: "b6aebb4a-5794-41ec-9084-2a0fc9103b58"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.511088 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-client-ca" (OuterVolumeSpecName: "client-ca") pod "b6aebb4a-5794-41ec-9084-2a0fc9103b58" (UID: "b6aebb4a-5794-41ec-9084-2a0fc9103b58"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.511088 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c4cb271-f0ee-4997-8e87-095f48d6658f-config" (OuterVolumeSpecName: "config") pod "4c4cb271-f0ee-4997-8e87-095f48d6658f" (UID: "4c4cb271-f0ee-4997-8e87-095f48d6658f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.520434 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6aebb4a-5794-41ec-9084-2a0fc9103b58-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b6aebb4a-5794-41ec-9084-2a0fc9103b58" (UID: "b6aebb4a-5794-41ec-9084-2a0fc9103b58"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.520489 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c4cb271-f0ee-4997-8e87-095f48d6658f-kube-api-access-2h4vc" (OuterVolumeSpecName: "kube-api-access-2h4vc") pod "4c4cb271-f0ee-4997-8e87-095f48d6658f" (UID: "4c4cb271-f0ee-4997-8e87-095f48d6658f"). InnerVolumeSpecName "kube-api-access-2h4vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.520489 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4cb271-f0ee-4997-8e87-095f48d6658f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4c4cb271-f0ee-4997-8e87-095f48d6658f" (UID: "4c4cb271-f0ee-4997-8e87-095f48d6658f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.529253 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6aebb4a-5794-41ec-9084-2a0fc9103b58-kube-api-access-zr9kc" (OuterVolumeSpecName: "kube-api-access-zr9kc") pod "b6aebb4a-5794-41ec-9084-2a0fc9103b58" (UID: "b6aebb4a-5794-41ec-9084-2a0fc9103b58"). InnerVolumeSpecName "kube-api-access-zr9kc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.611680 4961 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.611731 4961 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.611744 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h4vc\" (UniqueName: \"kubernetes.io/projected/4c4cb271-f0ee-4997-8e87-095f48d6658f-kube-api-access-2h4vc\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.611760 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aebb4a-5794-41ec-9084-2a0fc9103b58-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.611772 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr9kc\" (UniqueName: \"kubernetes.io/projected/b6aebb4a-5794-41ec-9084-2a0fc9103b58-kube-api-access-zr9kc\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.611783 4961 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c4cb271-f0ee-4997-8e87-095f48d6658f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.611793 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4cb271-f0ee-4997-8e87-095f48d6658f-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.611804 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6aebb4a-5794-41ec-9084-2a0fc9103b58-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.611814 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c4cb271-f0ee-4997-8e87-095f48d6658f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.723311 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" event={"ID":"b6aebb4a-5794-41ec-9084-2a0fc9103b58","Type":"ContainerDied","Data":"2bc1b0ec94cef7b8fc2a4e81fdaf2963ee0627e736f827170608247b6b1f34cf"} Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.723370 4961 scope.go:117] "RemoveContainer" containerID="ebd2de122e6aaa17cd8a0c213b7152b98e0a155ee6742e140e493f8519a40869" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.723406 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.726220 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" event={"ID":"2149a838-06ff-41e7-a655-2a6b51973caf","Type":"ContainerStarted","Data":"05211c0fa2b0c395b50dfb2e733f657def2f71fccaf8153a81a66210cdd2721a"} Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.726895 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.728620 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" event={"ID":"4c4cb271-f0ee-4997-8e87-095f48d6658f","Type":"ContainerDied","Data":"db03761a76c32557ff0c908dc3b9ac763fc9801dd71467bb95ba323dab2f27b4"} Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.729027 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.753649 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" podStartSLOduration=2.753628699 podStartE2EDuration="2.753628699s" podCreationTimestamp="2026-01-20 11:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:07:53.744792224 +0000 UTC m=+226.529292095" watchObservedRunningTime="2026-01-20 11:07:53.753628699 +0000 UTC m=+226.538128580" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.764244 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6446dffbbd-kz4d2"] Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.773464 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6446dffbbd-kz4d2"] Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.782677 4961 scope.go:117] "RemoveContainer" containerID="0b91837dedd88d57c7d0ce088c868d0bf5cc1fe1f034b17d4d31ae2a7e1c7fcf" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.787640 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6wpc4" podStartSLOduration=2.787618498 podStartE2EDuration="2.787618498s" podCreationTimestamp="2026-01-20 11:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:07:53.768051411 +0000 UTC m=+226.552551302" watchObservedRunningTime="2026-01-20 11:07:53.787618498 +0000 UTC m=+226.572118369" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.790109 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-97rvg" Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.793448 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr"] Jan 20 11:07:53 crc kubenswrapper[4961]: I0120 11:07:53.797204 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b586b96dd-b24nr"] Jan 20 11:07:54 crc 
kubenswrapper[4961]: I0120 11:07:54.361895 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6446dffbbd-dc9lh"] Jan 20 11:07:54 crc kubenswrapper[4961]: E0120 11:07:54.362429 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c4cb271-f0ee-4997-8e87-095f48d6658f" containerName="route-controller-manager" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.362444 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c4cb271-f0ee-4997-8e87-095f48d6658f" containerName="route-controller-manager" Jan 20 11:07:54 crc kubenswrapper[4961]: E0120 11:07:54.362459 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6aebb4a-5794-41ec-9084-2a0fc9103b58" containerName="controller-manager" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.362466 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6aebb4a-5794-41ec-9084-2a0fc9103b58" containerName="controller-manager" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.362550 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6aebb4a-5794-41ec-9084-2a0fc9103b58" containerName="controller-manager" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.362559 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c4cb271-f0ee-4997-8e87-095f48d6658f" containerName="route-controller-manager" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.362577 4961 patch_prober.go:28] interesting pod/controller-manager-6446dffbbd-kz4d2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.362633 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6446dffbbd-kz4d2" podUID="b6aebb4a-5794-41ec-9084-2a0fc9103b58" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.362903 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.365435 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.365492 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.366214 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.366358 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.366562 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.366791 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.371760 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6446dffbbd-dc9lh"] Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.378101 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.425102 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/606514f0-0665-408e-8f37-be9e7839c44d-config\") pod \"controller-manager-6446dffbbd-dc9lh\" (UID: \"606514f0-0665-408e-8f37-be9e7839c44d\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.425199 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tsr6\" (UniqueName: \"kubernetes.io/projected/606514f0-0665-408e-8f37-be9e7839c44d-kube-api-access-4tsr6\") pod \"controller-manager-6446dffbbd-dc9lh\" (UID: \"606514f0-0665-408e-8f37-be9e7839c44d\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.425272 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/606514f0-0665-408e-8f37-be9e7839c44d-proxy-ca-bundles\") pod \"controller-manager-6446dffbbd-dc9lh\" (UID: \"606514f0-0665-408e-8f37-be9e7839c44d\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.425384 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/606514f0-0665-408e-8f37-be9e7839c44d-serving-cert\") pod \"controller-manager-6446dffbbd-dc9lh\" (UID: \"606514f0-0665-408e-8f37-be9e7839c44d\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.425531 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/606514f0-0665-408e-8f37-be9e7839c44d-client-ca\") pod \"controller-manager-6446dffbbd-dc9lh\" (UID: \"606514f0-0665-408e-8f37-be9e7839c44d\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.526239 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/606514f0-0665-408e-8f37-be9e7839c44d-client-ca\") pod \"controller-manager-6446dffbbd-dc9lh\" (UID: \"606514f0-0665-408e-8f37-be9e7839c44d\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.526297 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/606514f0-0665-408e-8f37-be9e7839c44d-config\") pod \"controller-manager-6446dffbbd-dc9lh\" (UID: \"606514f0-0665-408e-8f37-be9e7839c44d\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.526378 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tsr6\" (UniqueName: \"kubernetes.io/projected/606514f0-0665-408e-8f37-be9e7839c44d-kube-api-access-4tsr6\") pod \"controller-manager-6446dffbbd-dc9lh\" (UID: \"606514f0-0665-408e-8f37-be9e7839c44d\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.526397 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/606514f0-0665-408e-8f37-be9e7839c44d-serving-cert\") pod \"controller-manager-6446dffbbd-dc9lh\" (UID: \"606514f0-0665-408e-8f37-be9e7839c44d\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.526435 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/606514f0-0665-408e-8f37-be9e7839c44d-proxy-ca-bundles\") pod \"controller-manager-6446dffbbd-dc9lh\" (UID: \"606514f0-0665-408e-8f37-be9e7839c44d\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.527423 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/606514f0-0665-408e-8f37-be9e7839c44d-client-ca\") pod \"controller-manager-6446dffbbd-dc9lh\" (UID: \"606514f0-0665-408e-8f37-be9e7839c44d\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.527689 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/606514f0-0665-408e-8f37-be9e7839c44d-config\") pod \"controller-manager-6446dffbbd-dc9lh\" (UID: \"606514f0-0665-408e-8f37-be9e7839c44d\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.527703 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/606514f0-0665-408e-8f37-be9e7839c44d-proxy-ca-bundles\") pod \"controller-manager-6446dffbbd-dc9lh\" (UID: \"606514f0-0665-408e-8f37-be9e7839c44d\") " 
pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.531682 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/606514f0-0665-408e-8f37-be9e7839c44d-serving-cert\") pod \"controller-manager-6446dffbbd-dc9lh\" (UID: \"606514f0-0665-408e-8f37-be9e7839c44d\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.554759 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tsr6\" (UniqueName: \"kubernetes.io/projected/606514f0-0665-408e-8f37-be9e7839c44d-kube-api-access-4tsr6\") pod \"controller-manager-6446dffbbd-dc9lh\" (UID: \"606514f0-0665-408e-8f37-be9e7839c44d\") " pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.679345 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.739438 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtbnk" event={"ID":"d1882c51-8bab-473c-bd83-fe8427d32470","Type":"ContainerStarted","Data":"29adbf54d479b448cdb80464a03d7216fbe0babf2db5f9f322816ebde9a01083"} Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.762670 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qtbnk" podStartSLOduration=2.598727951 podStartE2EDuration="11.762652218s" podCreationTimestamp="2026-01-20 11:07:43 +0000 UTC" firstStartedPulling="2026-01-20 11:07:44.660075288 +0000 UTC m=+217.444575159" lastFinishedPulling="2026-01-20 11:07:53.823999555 +0000 UTC m=+226.608499426" observedRunningTime="2026-01-20 11:07:54.761732396 +0000 UTC m=+227.546232277" watchObservedRunningTime="2026-01-20 11:07:54.762652218 +0000 UTC m=+227.547152089" Jan 20 11:07:54 crc kubenswrapper[4961]: I0120 11:07:54.881905 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6446dffbbd-dc9lh"] Jan 20 11:07:54 crc kubenswrapper[4961]: W0120 11:07:54.897520 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod606514f0_0665_408e_8f37_be9e7839c44d.slice/crio-5ed0f454710d8d7609ec8c1e0bc673b0efa31f2536acef55da59faaf2782977d WatchSource:0}: Error finding container 5ed0f454710d8d7609ec8c1e0bc673b0efa31f2536acef55da59faaf2782977d: Status 404 returned error can't find the container with id 5ed0f454710d8d7609ec8c1e0bc673b0efa31f2536acef55da59faaf2782977d Jan 20 11:07:55 crc kubenswrapper[4961]: I0120 11:07:55.548239 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c4cb271-f0ee-4997-8e87-095f48d6658f" path="/var/lib/kubelet/pods/4c4cb271-f0ee-4997-8e87-095f48d6658f/volumes" Jan 20 11:07:55 crc kubenswrapper[4961]: I0120 11:07:55.549159 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6aebb4a-5794-41ec-9084-2a0fc9103b58" path="/var/lib/kubelet/pods/b6aebb4a-5794-41ec-9084-2a0fc9103b58/volumes" Jan 20 11:07:55 crc kubenswrapper[4961]: I0120 11:07:55.746825 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" 
event={"ID":"606514f0-0665-408e-8f37-be9e7839c44d","Type":"ContainerStarted","Data":"7dffc3ebe4660e5873387507cd6e559040ee145993e6db7463caec0522911db8"} Jan 20 11:07:55 crc kubenswrapper[4961]: I0120 11:07:55.746870 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" event={"ID":"606514f0-0665-408e-8f37-be9e7839c44d","Type":"ContainerStarted","Data":"5ed0f454710d8d7609ec8c1e0bc673b0efa31f2536acef55da59faaf2782977d"} Jan 20 11:07:55 crc kubenswrapper[4961]: I0120 11:07:55.747756 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:55 crc kubenswrapper[4961]: I0120 11:07:55.757727 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" Jan 20 11:07:55 crc kubenswrapper[4961]: I0120 11:07:55.799610 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6446dffbbd-dc9lh" podStartSLOduration=3.799594829 podStartE2EDuration="3.799594829s" podCreationTimestamp="2026-01-20 11:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:07:55.779286644 +0000 UTC m=+228.563786525" watchObservedRunningTime="2026-01-20 11:07:55.799594829 +0000 UTC m=+228.584094700" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.363930 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn"] Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.364766 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.366650 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.367134 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.367344 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.367506 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.367723 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.369362 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.373979 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn"] Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.458903 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmv4b\" (UniqueName: \"kubernetes.io/projected/a54b24ba-eeae-4b13-baab-a6b2ab148b29-kube-api-access-pmv4b\") pod \"route-controller-manager-59b8b66648-kl7jn\" (UID: \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\") " pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.458968 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a54b24ba-eeae-4b13-baab-a6b2ab148b29-config\") pod \"route-controller-manager-59b8b66648-kl7jn\" (UID: \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\") " pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.459037 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a54b24ba-eeae-4b13-baab-a6b2ab148b29-client-ca\") pod \"route-controller-manager-59b8b66648-kl7jn\" (UID: \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\") " pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.459081 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a54b24ba-eeae-4b13-baab-a6b2ab148b29-serving-cert\") pod \"route-controller-manager-59b8b66648-kl7jn\" (UID: \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\") " pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.560652 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a54b24ba-eeae-4b13-baab-a6b2ab148b29-client-ca\") pod 
\"route-controller-manager-59b8b66648-kl7jn\" (UID: \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\") " pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.560722 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a54b24ba-eeae-4b13-baab-a6b2ab148b29-serving-cert\") pod \"route-controller-manager-59b8b66648-kl7jn\" (UID: \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\") " pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.560781 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmv4b\" (UniqueName: \"kubernetes.io/projected/a54b24ba-eeae-4b13-baab-a6b2ab148b29-kube-api-access-pmv4b\") pod \"route-controller-manager-59b8b66648-kl7jn\" (UID: \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\") " pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.560818 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a54b24ba-eeae-4b13-baab-a6b2ab148b29-config\") pod \"route-controller-manager-59b8b66648-kl7jn\" (UID: \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\") " pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.562437 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a54b24ba-eeae-4b13-baab-a6b2ab148b29-client-ca\") pod \"route-controller-manager-59b8b66648-kl7jn\" (UID: \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\") " pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.562661 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a54b24ba-eeae-4b13-baab-a6b2ab148b29-config\") pod \"route-controller-manager-59b8b66648-kl7jn\" (UID: \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\") " pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.567320 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a54b24ba-eeae-4b13-baab-a6b2ab148b29-serving-cert\") pod \"route-controller-manager-59b8b66648-kl7jn\" (UID: \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\") " pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.579444 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmv4b\" (UniqueName: \"kubernetes.io/projected/a54b24ba-eeae-4b13-baab-a6b2ab148b29-kube-api-access-pmv4b\") pod \"route-controller-manager-59b8b66648-kl7jn\" (UID: \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\") " pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.681179 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:07:56 crc kubenswrapper[4961]: I0120 11:07:56.881172 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn"] Jan 20 11:07:56 crc kubenswrapper[4961]: W0120 11:07:56.889183 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda54b24ba_eeae_4b13_baab_a6b2ab148b29.slice/crio-b1f8e7063cbd471dcf5018c3ee8435b33c8840271c3ddded61fe68b2beec1a48 WatchSource:0}: Error finding container b1f8e7063cbd471dcf5018c3ee8435b33c8840271c3ddded61fe68b2beec1a48: Status 404 returned error can't find the container with id b1f8e7063cbd471dcf5018c3ee8435b33c8840271c3ddded61fe68b2beec1a48 Jan 20 11:07:57 crc kubenswrapper[4961]: I0120 11:07:57.757465 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" event={"ID":"a54b24ba-eeae-4b13-baab-a6b2ab148b29","Type":"ContainerStarted","Data":"b1f8e7063cbd471dcf5018c3ee8435b33c8840271c3ddded61fe68b2beec1a48"} Jan 20 11:07:59 crc kubenswrapper[4961]: I0120 11:07:59.776165 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" event={"ID":"a54b24ba-eeae-4b13-baab-a6b2ab148b29","Type":"ContainerStarted","Data":"9744edad0ee9a72f77476cca4159248ca1924d3fc06baf5f5d2d11975223b0c3"} Jan 20 11:07:59 crc kubenswrapper[4961]: I0120 11:07:59.780312 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:07:59 crc kubenswrapper[4961]: I0120 11:07:59.786663 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:07:59 crc kubenswrapper[4961]: I0120 11:07:59.805526 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" podStartSLOduration=8.805502754 podStartE2EDuration="8.805502754s" podCreationTimestamp="2026-01-20 11:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:07:59.798960924 +0000 UTC m=+232.583460815" watchObservedRunningTime="2026-01-20 11:07:59.805502754 +0000 UTC m=+232.590002625" Jan 20 11:08:01 crc kubenswrapper[4961]: I0120 11:08:01.167702 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6wb7h" Jan 20 11:08:01 crc kubenswrapper[4961]: I0120 11:08:01.205955 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6wb7h" Jan 20 11:08:01 crc kubenswrapper[4961]: I0120 11:08:01.579239 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6wpc4" Jan 20 11:08:01 crc kubenswrapper[4961]: I0120 11:08:01.581714 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6wpc4" Jan 20 11:08:03 crc kubenswrapper[4961]: I0120 11:08:03.919612 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-qtbnk" Jan 20 11:08:03 crc kubenswrapper[4961]: I0120 11:08:03.919991 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qtbnk" Jan 20 11:08:03 crc kubenswrapper[4961]: I0120 11:08:03.963047 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qtbnk" Jan 20 11:08:04 crc kubenswrapper[4961]: I0120 11:08:04.846351 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qtbnk" Jan 20 11:08:11 crc kubenswrapper[4961]: I0120 11:08:11.936304 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-fsp5c" Jan 20 11:08:11 crc kubenswrapper[4961]: I0120 11:08:11.987213 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-z54rs"] Jan 20 11:08:12 crc kubenswrapper[4961]: I0120 11:08:12.544101 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn"] Jan 20 11:08:12 crc kubenswrapper[4961]: I0120 11:08:12.544301 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" podUID="a54b24ba-eeae-4b13-baab-a6b2ab148b29" containerName="route-controller-manager" containerID="cri-o://9744edad0ee9a72f77476cca4159248ca1924d3fc06baf5f5d2d11975223b0c3" gracePeriod=30 Jan 20 11:08:12 crc kubenswrapper[4961]: I0120 11:08:12.841141 4961 generic.go:334] "Generic (PLEG): container finished" podID="a54b24ba-eeae-4b13-baab-a6b2ab148b29" containerID="9744edad0ee9a72f77476cca4159248ca1924d3fc06baf5f5d2d11975223b0c3" exitCode=0 Jan 20 11:08:12 crc kubenswrapper[4961]: I0120 11:08:12.841212 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" event={"ID":"a54b24ba-eeae-4b13-baab-a6b2ab148b29","Type":"ContainerDied","Data":"9744edad0ee9a72f77476cca4159248ca1924d3fc06baf5f5d2d11975223b0c3"} Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.612877 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.764906 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmv4b\" (UniqueName: \"kubernetes.io/projected/a54b24ba-eeae-4b13-baab-a6b2ab148b29-kube-api-access-pmv4b\") pod \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\" (UID: \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\") " Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.765016 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a54b24ba-eeae-4b13-baab-a6b2ab148b29-serving-cert\") pod \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\" (UID: \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\") " Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.765198 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a54b24ba-eeae-4b13-baab-a6b2ab148b29-client-ca\") pod \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\" (UID: \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\") " Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.765308 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a54b24ba-eeae-4b13-baab-a6b2ab148b29-config\") pod \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\" (UID: \"a54b24ba-eeae-4b13-baab-a6b2ab148b29\") " Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.766797 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a54b24ba-eeae-4b13-baab-a6b2ab148b29-config" (OuterVolumeSpecName: "config") pod "a54b24ba-eeae-4b13-baab-a6b2ab148b29" (UID: "a54b24ba-eeae-4b13-baab-a6b2ab148b29"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.767864 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a54b24ba-eeae-4b13-baab-a6b2ab148b29-client-ca" (OuterVolumeSpecName: "client-ca") pod "a54b24ba-eeae-4b13-baab-a6b2ab148b29" (UID: "a54b24ba-eeae-4b13-baab-a6b2ab148b29"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.776811 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a54b24ba-eeae-4b13-baab-a6b2ab148b29-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a54b24ba-eeae-4b13-baab-a6b2ab148b29" (UID: "a54b24ba-eeae-4b13-baab-a6b2ab148b29"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.776989 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54b24ba-eeae-4b13-baab-a6b2ab148b29-kube-api-access-pmv4b" (OuterVolumeSpecName: "kube-api-access-pmv4b") pod "a54b24ba-eeae-4b13-baab-a6b2ab148b29" (UID: "a54b24ba-eeae-4b13-baab-a6b2ab148b29"). InnerVolumeSpecName "kube-api-access-pmv4b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.849000 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" event={"ID":"a54b24ba-eeae-4b13-baab-a6b2ab148b29","Type":"ContainerDied","Data":"b1f8e7063cbd471dcf5018c3ee8435b33c8840271c3ddded61fe68b2beec1a48"} Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.849086 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn" Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.849125 4961 scope.go:117] "RemoveContainer" containerID="9744edad0ee9a72f77476cca4159248ca1924d3fc06baf5f5d2d11975223b0c3" Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.867642 4961 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a54b24ba-eeae-4b13-baab-a6b2ab148b29-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.867682 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a54b24ba-eeae-4b13-baab-a6b2ab148b29-config\") on node \"crc\" DevicePath \"\"" Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.867697 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmv4b\" (UniqueName: \"kubernetes.io/projected/a54b24ba-eeae-4b13-baab-a6b2ab148b29-kube-api-access-pmv4b\") on node \"crc\" DevicePath \"\"" Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.867712 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a54b24ba-eeae-4b13-baab-a6b2ab148b29-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.884423 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn"] Jan 20 11:08:13 crc kubenswrapper[4961]: I0120 11:08:13.889427 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b8b66648-kl7jn"] Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.382181 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2"] Jan 20 11:08:14 crc kubenswrapper[4961]: E0120 11:08:14.382408 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54b24ba-eeae-4b13-baab-a6b2ab148b29" containerName="route-controller-manager" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.382420 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54b24ba-eeae-4b13-baab-a6b2ab148b29" containerName="route-controller-manager" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.382528 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="a54b24ba-eeae-4b13-baab-a6b2ab148b29" containerName="route-controller-manager" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.382868 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.386261 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.386667 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.386862 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.386920 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.386869 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.387204 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.396665 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2"] Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.477481 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9641282c-ee5b-4f81-986e-6afef76a653d-serving-cert\") pod \"route-controller-manager-6b586b96dd-zkbl2\" (UID: \"9641282c-ee5b-4f81-986e-6afef76a653d\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.477540 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgs5f\" (UniqueName: \"kubernetes.io/projected/9641282c-ee5b-4f81-986e-6afef76a653d-kube-api-access-fgs5f\") pod \"route-controller-manager-6b586b96dd-zkbl2\" (UID: \"9641282c-ee5b-4f81-986e-6afef76a653d\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.477566 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9641282c-ee5b-4f81-986e-6afef76a653d-client-ca\") pod \"route-controller-manager-6b586b96dd-zkbl2\" (UID: \"9641282c-ee5b-4f81-986e-6afef76a653d\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.477596 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9641282c-ee5b-4f81-986e-6afef76a653d-config\") pod \"route-controller-manager-6b586b96dd-zkbl2\" (UID: \"9641282c-ee5b-4f81-986e-6afef76a653d\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.578676 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9641282c-ee5b-4f81-986e-6afef76a653d-serving-cert\") pod 
\"route-controller-manager-6b586b96dd-zkbl2\" (UID: \"9641282c-ee5b-4f81-986e-6afef76a653d\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.578746 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgs5f\" (UniqueName: \"kubernetes.io/projected/9641282c-ee5b-4f81-986e-6afef76a653d-kube-api-access-fgs5f\") pod \"route-controller-manager-6b586b96dd-zkbl2\" (UID: \"9641282c-ee5b-4f81-986e-6afef76a653d\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.578779 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9641282c-ee5b-4f81-986e-6afef76a653d-client-ca\") pod \"route-controller-manager-6b586b96dd-zkbl2\" (UID: \"9641282c-ee5b-4f81-986e-6afef76a653d\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.578823 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9641282c-ee5b-4f81-986e-6afef76a653d-config\") pod \"route-controller-manager-6b586b96dd-zkbl2\" (UID: \"9641282c-ee5b-4f81-986e-6afef76a653d\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.579843 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9641282c-ee5b-4f81-986e-6afef76a653d-client-ca\") pod \"route-controller-manager-6b586b96dd-zkbl2\" (UID: \"9641282c-ee5b-4f81-986e-6afef76a653d\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.580110 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9641282c-ee5b-4f81-986e-6afef76a653d-config\") pod \"route-controller-manager-6b586b96dd-zkbl2\" (UID: \"9641282c-ee5b-4f81-986e-6afef76a653d\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.585403 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9641282c-ee5b-4f81-986e-6afef76a653d-serving-cert\") pod \"route-controller-manager-6b586b96dd-zkbl2\" (UID: \"9641282c-ee5b-4f81-986e-6afef76a653d\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.594394 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgs5f\" (UniqueName: \"kubernetes.io/projected/9641282c-ee5b-4f81-986e-6afef76a653d-kube-api-access-fgs5f\") pod \"route-controller-manager-6b586b96dd-zkbl2\" (UID: \"9641282c-ee5b-4f81-986e-6afef76a653d\") " pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" Jan 20 11:08:14 crc kubenswrapper[4961]: I0120 11:08:14.700247 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" Jan 20 11:08:15 crc kubenswrapper[4961]: I0120 11:08:15.166168 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2"] Jan 20 11:08:15 crc kubenswrapper[4961]: I0120 11:08:15.547839 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a54b24ba-eeae-4b13-baab-a6b2ab148b29" path="/var/lib/kubelet/pods/a54b24ba-eeae-4b13-baab-a6b2ab148b29/volumes" Jan 20 11:08:15 crc kubenswrapper[4961]: I0120 11:08:15.863191 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" event={"ID":"9641282c-ee5b-4f81-986e-6afef76a653d","Type":"ContainerStarted","Data":"5f08a3b4a555fd7b450980681080b74030123896bb431758a179c2d8ad8c748c"} Jan 20 11:08:15 crc kubenswrapper[4961]: I0120 11:08:15.863275 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" event={"ID":"9641282c-ee5b-4f81-986e-6afef76a653d","Type":"ContainerStarted","Data":"70aab939d1cac10207d8d3f13a6c3a9e0acabf057ef0cfb7a17ff10681cdeb8e"} Jan 20 11:08:15 crc kubenswrapper[4961]: I0120 11:08:15.863520 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" Jan 20 11:08:15 crc kubenswrapper[4961]: I0120 11:08:15.883002 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" podStartSLOduration=3.882984282 podStartE2EDuration="3.882984282s" podCreationTimestamp="2026-01-20 11:08:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 11:08:15.878433421 +0000 UTC m=+248.662933292" watchObservedRunningTime="2026-01-20 11:08:15.882984282 +0000 UTC m=+248.667484153" Jan 20 11:08:16 crc kubenswrapper[4961]: I0120 11:08:16.196278 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6b586b96dd-zkbl2" Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.026796 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" podUID="2c38b7e7-a659-4038-aadf-b54948bfebf4" containerName="registry" containerID="cri-o://d6e2d22b18e96e1680f4e2749cb5e3ef3b0305e71197cb6ccaf76135d1a02558" gracePeriod=30 Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.475891 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.607321 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"2c38b7e7-a659-4038-aadf-b54948bfebf4\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.607368 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-registry-tls\") pod \"2c38b7e7-a659-4038-aadf-b54948bfebf4\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.607390 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c38b7e7-a659-4038-aadf-b54948bfebf4-registry-certificates\") pod \"2c38b7e7-a659-4038-aadf-b54948bfebf4\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.607411 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwcwd\" (UniqueName: \"kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-kube-api-access-bwcwd\") pod \"2c38b7e7-a659-4038-aadf-b54948bfebf4\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.607432 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c38b7e7-a659-4038-aadf-b54948bfebf4-trusted-ca\") pod \"2c38b7e7-a659-4038-aadf-b54948bfebf4\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.607476 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c38b7e7-a659-4038-aadf-b54948bfebf4-ca-trust-extracted\") pod \"2c38b7e7-a659-4038-aadf-b54948bfebf4\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.607495 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-bound-sa-token\") pod \"2c38b7e7-a659-4038-aadf-b54948bfebf4\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.607517 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c38b7e7-a659-4038-aadf-b54948bfebf4-installation-pull-secrets\") pod \"2c38b7e7-a659-4038-aadf-b54948bfebf4\" (UID: \"2c38b7e7-a659-4038-aadf-b54948bfebf4\") " Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.608870 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c38b7e7-a659-4038-aadf-b54948bfebf4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2c38b7e7-a659-4038-aadf-b54948bfebf4" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.609455 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c38b7e7-a659-4038-aadf-b54948bfebf4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2c38b7e7-a659-4038-aadf-b54948bfebf4" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.614392 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2c38b7e7-a659-4038-aadf-b54948bfebf4" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.614899 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-kube-api-access-bwcwd" (OuterVolumeSpecName: "kube-api-access-bwcwd") pod "2c38b7e7-a659-4038-aadf-b54948bfebf4" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4"). InnerVolumeSpecName "kube-api-access-bwcwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.626290 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2c38b7e7-a659-4038-aadf-b54948bfebf4" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.626285 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c38b7e7-a659-4038-aadf-b54948bfebf4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2c38b7e7-a659-4038-aadf-b54948bfebf4" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.627276 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "2c38b7e7-a659-4038-aadf-b54948bfebf4" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.629851 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c38b7e7-a659-4038-aadf-b54948bfebf4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2c38b7e7-a659-4038-aadf-b54948bfebf4" (UID: "2c38b7e7-a659-4038-aadf-b54948bfebf4"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.708659 4961 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c38b7e7-a659-4038-aadf-b54948bfebf4-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.708696 4961 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.708706 4961 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c38b7e7-a659-4038-aadf-b54948bfebf4-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.708714 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwcwd\" (UniqueName: \"kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-kube-api-access-bwcwd\") on node \"crc\" DevicePath \"\"" Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.708722 4961 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c38b7e7-a659-4038-aadf-b54948bfebf4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.708730 4961 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c38b7e7-a659-4038-aadf-b54948bfebf4-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 20 11:08:37 crc kubenswrapper[4961]: I0120 11:08:37.708738 4961 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c38b7e7-a659-4038-aadf-b54948bfebf4-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 20 11:08:38 crc kubenswrapper[4961]: I0120 11:08:38.008238 4961 generic.go:334] "Generic (PLEG): container finished" podID="2c38b7e7-a659-4038-aadf-b54948bfebf4" containerID="d6e2d22b18e96e1680f4e2749cb5e3ef3b0305e71197cb6ccaf76135d1a02558" exitCode=0 Jan 20 11:08:38 crc kubenswrapper[4961]: I0120 11:08:38.008288 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" event={"ID":"2c38b7e7-a659-4038-aadf-b54948bfebf4","Type":"ContainerDied","Data":"d6e2d22b18e96e1680f4e2749cb5e3ef3b0305e71197cb6ccaf76135d1a02558"} Jan 20 11:08:38 crc kubenswrapper[4961]: I0120 11:08:38.008320 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" event={"ID":"2c38b7e7-a659-4038-aadf-b54948bfebf4","Type":"ContainerDied","Data":"5c3624bbeb6985b106759993e86bc53d1aa6d084bc0033d114f858da33c151ff"} Jan 20 11:08:38 crc kubenswrapper[4961]: I0120 11:08:38.008337 4961 scope.go:117] "RemoveContainer" containerID="d6e2d22b18e96e1680f4e2749cb5e3ef3b0305e71197cb6ccaf76135d1a02558" Jan 20 11:08:38 crc kubenswrapper[4961]: I0120 11:08:38.008382 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-z54rs" Jan 20 11:08:38 crc kubenswrapper[4961]: I0120 11:08:38.026900 4961 scope.go:117] "RemoveContainer" containerID="d6e2d22b18e96e1680f4e2749cb5e3ef3b0305e71197cb6ccaf76135d1a02558" Jan 20 11:08:38 crc kubenswrapper[4961]: E0120 11:08:38.027550 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6e2d22b18e96e1680f4e2749cb5e3ef3b0305e71197cb6ccaf76135d1a02558\": container with ID starting with d6e2d22b18e96e1680f4e2749cb5e3ef3b0305e71197cb6ccaf76135d1a02558 not found: ID does not exist" containerID="d6e2d22b18e96e1680f4e2749cb5e3ef3b0305e71197cb6ccaf76135d1a02558" Jan 20 11:08:38 crc kubenswrapper[4961]: I0120 11:08:38.027612 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e2d22b18e96e1680f4e2749cb5e3ef3b0305e71197cb6ccaf76135d1a02558"} err="failed to get container status \"d6e2d22b18e96e1680f4e2749cb5e3ef3b0305e71197cb6ccaf76135d1a02558\": rpc error: code = NotFound desc = could not find container \"d6e2d22b18e96e1680f4e2749cb5e3ef3b0305e71197cb6ccaf76135d1a02558\": container with ID starting with d6e2d22b18e96e1680f4e2749cb5e3ef3b0305e71197cb6ccaf76135d1a02558 not found: ID does not exist" Jan 20 11:08:38 crc kubenswrapper[4961]: I0120 11:08:38.043470 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-z54rs"] Jan 20 11:08:38 crc kubenswrapper[4961]: I0120 11:08:38.047965 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-z54rs"] Jan 20 11:08:39 crc kubenswrapper[4961]: I0120 11:08:39.551751 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c38b7e7-a659-4038-aadf-b54948bfebf4" path="/var/lib/kubelet/pods/2c38b7e7-a659-4038-aadf-b54948bfebf4/volumes" Jan 20 11:09:07 crc kubenswrapper[4961]: I0120 11:09:07.373637 4961 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 20 11:09:31 crc kubenswrapper[4961]: I0120 11:09:31.723583 4961 patch_prober.go:28] interesting pod/machine-config-daemon-48nk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 11:09:31 crc kubenswrapper[4961]: I0120 11:09:31.724391 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" podUID="8a5754ab-8fe3-41b8-b760-b3d154e89ba8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 11:10:01 crc kubenswrapper[4961]: I0120 11:10:01.723521 4961 patch_prober.go:28] interesting pod/machine-config-daemon-48nk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 11:10:01 crc kubenswrapper[4961]: I0120 11:10:01.724427 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" podUID="8a5754ab-8fe3-41b8-b760-b3d154e89ba8" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 11:10:31 crc kubenswrapper[4961]: I0120 11:10:31.723564 4961 patch_prober.go:28] interesting pod/machine-config-daemon-48nk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 11:10:31 crc kubenswrapper[4961]: I0120 11:10:31.724595 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" podUID="8a5754ab-8fe3-41b8-b760-b3d154e89ba8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 11:10:31 crc kubenswrapper[4961]: I0120 11:10:31.724656 4961 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" Jan 20 11:10:31 crc kubenswrapper[4961]: I0120 11:10:31.725859 4961 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c7a9e2302da973f7c15895195b1bab45843c930db322ad1902f673d137a19234"} pod="openshift-machine-config-operator/machine-config-daemon-48nk4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 11:10:31 crc kubenswrapper[4961]: I0120 11:10:31.725961 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" podUID="8a5754ab-8fe3-41b8-b760-b3d154e89ba8" containerName="machine-config-daemon" containerID="cri-o://c7a9e2302da973f7c15895195b1bab45843c930db322ad1902f673d137a19234" gracePeriod=600 Jan 20 11:10:32 crc kubenswrapper[4961]: I0120 11:10:32.704970 4961 generic.go:334] "Generic (PLEG): container finished" podID="8a5754ab-8fe3-41b8-b760-b3d154e89ba8" containerID="c7a9e2302da973f7c15895195b1bab45843c930db322ad1902f673d137a19234" exitCode=0 Jan 20 11:10:32 crc kubenswrapper[4961]: I0120 11:10:32.705126 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" event={"ID":"8a5754ab-8fe3-41b8-b760-b3d154e89ba8","Type":"ContainerDied","Data":"c7a9e2302da973f7c15895195b1bab45843c930db322ad1902f673d137a19234"} Jan 20 11:10:32 crc kubenswrapper[4961]: I0120 11:10:32.705329 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" event={"ID":"8a5754ab-8fe3-41b8-b760-b3d154e89ba8","Type":"ContainerStarted","Data":"b73e0101eed1e6fc412b2fecb75734e88760385242df02c051336e56fffd72cf"} Jan 20 11:10:32 crc kubenswrapper[4961]: I0120 11:10:32.705350 4961 scope.go:117] "RemoveContainer" containerID="6f067c6d9f779591467594bcff24d07919bbe280c82d7bc657faa215a6e63cdd" Jan 20 11:13:01 crc kubenswrapper[4961]: I0120 11:13:01.723816 4961 patch_prober.go:28] interesting pod/machine-config-daemon-48nk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 11:13:01 crc kubenswrapper[4961]: I0120 11:13:01.726144 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" 
podUID="8a5754ab-8fe3-41b8-b760-b3d154e89ba8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 11:13:31 crc kubenswrapper[4961]: I0120 11:13:31.723493 4961 patch_prober.go:28] interesting pod/machine-config-daemon-48nk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 11:13:31 crc kubenswrapper[4961]: I0120 11:13:31.724053 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" podUID="8a5754ab-8fe3-41b8-b760-b3d154e89ba8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 11:14:01 crc kubenswrapper[4961]: I0120 11:14:01.723361 4961 patch_prober.go:28] interesting pod/machine-config-daemon-48nk4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 11:14:01 crc kubenswrapper[4961]: I0120 11:14:01.724301 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" podUID="8a5754ab-8fe3-41b8-b760-b3d154e89ba8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 11:14:01 crc kubenswrapper[4961]: I0120 11:14:01.724346 4961 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" Jan 20 11:14:01 crc kubenswrapper[4961]: I0120 11:14:01.724903 4961 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b73e0101eed1e6fc412b2fecb75734e88760385242df02c051336e56fffd72cf"} pod="openshift-machine-config-operator/machine-config-daemon-48nk4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 11:14:01 crc kubenswrapper[4961]: I0120 11:14:01.724947 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" podUID="8a5754ab-8fe3-41b8-b760-b3d154e89ba8" containerName="machine-config-daemon" containerID="cri-o://b73e0101eed1e6fc412b2fecb75734e88760385242df02c051336e56fffd72cf" gracePeriod=600 Jan 20 11:14:01 crc kubenswrapper[4961]: I0120 11:14:01.986789 4961 generic.go:334] "Generic (PLEG): container finished" podID="8a5754ab-8fe3-41b8-b760-b3d154e89ba8" containerID="b73e0101eed1e6fc412b2fecb75734e88760385242df02c051336e56fffd72cf" exitCode=0 Jan 20 11:14:01 crc kubenswrapper[4961]: I0120 11:14:01.986870 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" event={"ID":"8a5754ab-8fe3-41b8-b760-b3d154e89ba8","Type":"ContainerDied","Data":"b73e0101eed1e6fc412b2fecb75734e88760385242df02c051336e56fffd72cf"} Jan 20 11:14:01 crc kubenswrapper[4961]: I0120 11:14:01.987308 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-48nk4" 
event={"ID":"8a5754ab-8fe3-41b8-b760-b3d154e89ba8","Type":"ContainerStarted","Data":"134a5a659c0c24fb1a6d96bca69bf95253f687b51b90d707d22c944c7326611e"} Jan 20 11:14:01 crc kubenswrapper[4961]: I0120 11:14:01.987333 4961 scope.go:117] "RemoveContainer" containerID="c7a9e2302da973f7c15895195b1bab45843c930db322ad1902f673d137a19234"