Jan 22 09:03:28 crc systemd[1]: Starting Kubernetes Kubelet... Jan 22 09:03:28 crc restorecon[4680]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 22 09:03:28 
crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 
09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc 
restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 
crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 
crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 
09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 22 09:03:28 crc 
restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 
09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:28 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 
09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc 
restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 09:03:29 crc restorecon[4680]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 22 09:03:29 crc restorecon[4680]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 22 09:03:29 crc kubenswrapper[4681]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 22 09:03:29 crc kubenswrapper[4681]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 22 09:03:29 crc kubenswrapper[4681]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 22 09:03:29 crc kubenswrapper[4681]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 22 09:03:29 crc kubenswrapper[4681]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 22 09:03:29 crc kubenswrapper[4681]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.300183 4681 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302848 4681 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302868 4681 feature_gate.go:330] unrecognized feature gate: Example Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302873 4681 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302877 4681 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302881 4681 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302885 4681 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302888 4681 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302892 4681 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302896 4681 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302899 4681 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302903 4681 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302906 4681 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302909 4681 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302913 4681 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302916 4681 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302920 4681 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302924 4681 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302927 4681 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302931 4681 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302934 4681 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302937 4681 feature_gate.go:330] unrecognized 
feature gate: VSphereControlPlaneMachineSet Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302941 4681 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302944 4681 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302948 4681 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302951 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302955 4681 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302959 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302962 4681 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302965 4681 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302969 4681 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302972 4681 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302977 4681 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302980 4681 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302984 4681 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302987 4681 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302990 4681 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302994 4681 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.302999 4681 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303003 4681 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303007 4681 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303011 4681 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303014 4681 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303018 4681 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303021 4681 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303024 4681 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303028 4681 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303031 4681 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303035 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303039 4681 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303042 4681 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303045 4681 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303049 4681 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303052 4681 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303056 4681 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303059 4681 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303064 4681 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303068 4681 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303073 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303077 4681 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303080 4681 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303086 4681 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
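The repeated `feature_gate.go:330] unrecognized feature gate: ...` warnings above come from the kubelet being handed a gate list that includes OpenShift-specific gate names it does not recognize; unknown names are warned about and skipped, while gates that are already GA or deprecated upstream (ValidatingAdmissionPolicy, CloudDualStackNodeIPs, KMSv1) produce the "will be removed in a future release" notices. The sketch below is a minimal illustration of that warn-and-skip pattern under those assumptions, not the k8s.io/component-base featuregate implementation:

```go
// Minimal sketch of the warn-and-skip pattern behind the
// "unrecognized feature gate" / "Setting GA feature gate" lines above.
package main

import "fmt"

type gateState int

const (
	alpha gateState = iota
	beta
	ga
	deprecated
)

// known is a made-up subset of gates the binary recognizes.
var known = map[string]gateState{
	"ValidatingAdmissionPolicy":              ga,
	"CloudDualStackNodeIPs":                  ga,
	"DisableKubeletCloudCredentialProviders": ga,
	"KMSv1":                                  deprecated,
	"NodeSwap":                               beta,
}

// apply mirrors the log's behaviour: unknown names produce a warning and are
// ignored; GA/deprecated gates warn that the explicit setting will go away.
func apply(requested map[string]bool) map[string]bool {
	enabled := map[string]bool{}
	for name, value := range requested {
		state, ok := known[name]
		if !ok {
			fmt.Printf("W unrecognized feature gate: %s\n", name)
			continue
		}
		switch state {
		case ga:
			fmt.Printf("W Setting GA feature gate %s=%t. It will be removed in a future release.\n", name, value)
		case deprecated:
			fmt.Printf("W Setting deprecated feature gate %s=%t. It will be removed in a future release.\n", name, value)
		}
		enabled[name] = value
	}
	return enabled
}

func main() {
	gates := apply(map[string]bool{
		"ValidatingAdmissionPolicy": true,
		"KMSv1":                     true,
		"MachineConfigNodes":        true, // OpenShift-only gate: unknown here
	})
	fmt.Printf("feature gates: %v\n", gates)
}
```

The same gate set appears to be evaluated more than once during startup, which is why nearly identical warning blocks repeat below, each ending in the same resolved `feature gates: {map[...]}` summary.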
Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303090 4681 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303094 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303098 4681 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303101 4681 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303104 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303109 4681 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303118 4681 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303122 4681 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303126 4681 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.303129 4681 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303489 4681 flags.go:64] FLAG: --address="0.0.0.0" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303501 4681 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303510 4681 flags.go:64] FLAG: --anonymous-auth="true" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303515 4681 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303520 4681 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303525 4681 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303530 4681 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303535 4681 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303539 4681 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303543 4681 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303548 4681 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303552 4681 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303556 4681 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303560 4681 flags.go:64] FLAG: --cgroup-root="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303564 4681 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303569 4681 flags.go:64] FLAG: --client-ca-file="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303573 4681 flags.go:64] FLAG: --cloud-config="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303578 
4681 flags.go:64] FLAG: --cloud-provider="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303582 4681 flags.go:64] FLAG: --cluster-dns="[]" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303590 4681 flags.go:64] FLAG: --cluster-domain="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303594 4681 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303598 4681 flags.go:64] FLAG: --config-dir="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303602 4681 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303607 4681 flags.go:64] FLAG: --container-log-max-files="5" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303612 4681 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303616 4681 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303620 4681 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303624 4681 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303628 4681 flags.go:64] FLAG: --contention-profiling="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303632 4681 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303636 4681 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303640 4681 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303645 4681 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303650 4681 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303654 4681 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303658 4681 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303662 4681 flags.go:64] FLAG: --enable-load-reader="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303667 4681 flags.go:64] FLAG: --enable-server="true" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303671 4681 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303676 4681 flags.go:64] FLAG: --event-burst="100" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303680 4681 flags.go:64] FLAG: --event-qps="50" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303684 4681 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303688 4681 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303692 4681 flags.go:64] FLAG: --eviction-hard="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303697 4681 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303701 4681 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303705 4681 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 
09:03:29.303709 4681 flags.go:64] FLAG: --eviction-soft="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303713 4681 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303717 4681 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303721 4681 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303725 4681 flags.go:64] FLAG: --experimental-mounter-path="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303729 4681 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303733 4681 flags.go:64] FLAG: --fail-swap-on="true" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303737 4681 flags.go:64] FLAG: --feature-gates="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303742 4681 flags.go:64] FLAG: --file-check-frequency="20s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303746 4681 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303750 4681 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303754 4681 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303758 4681 flags.go:64] FLAG: --healthz-port="10248" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303762 4681 flags.go:64] FLAG: --help="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303766 4681 flags.go:64] FLAG: --hostname-override="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303770 4681 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303775 4681 flags.go:64] FLAG: --http-check-frequency="20s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303779 4681 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303782 4681 flags.go:64] FLAG: --image-credential-provider-config="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303786 4681 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303790 4681 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303795 4681 flags.go:64] FLAG: --image-service-endpoint="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303799 4681 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303803 4681 flags.go:64] FLAG: --kube-api-burst="100" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303807 4681 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303811 4681 flags.go:64] FLAG: --kube-api-qps="50" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303815 4681 flags.go:64] FLAG: --kube-reserved="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303819 4681 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303823 4681 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303827 4681 flags.go:64] FLAG: --kubelet-cgroups="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303831 4681 flags.go:64] 
FLAG: --local-storage-capacity-isolation="true" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303834 4681 flags.go:64] FLAG: --lock-file="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303838 4681 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303842 4681 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303846 4681 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303852 4681 flags.go:64] FLAG: --log-json-split-stream="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303856 4681 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303860 4681 flags.go:64] FLAG: --log-text-split-stream="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303864 4681 flags.go:64] FLAG: --logging-format="text" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303868 4681 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303872 4681 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303876 4681 flags.go:64] FLAG: --manifest-url="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303880 4681 flags.go:64] FLAG: --manifest-url-header="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303886 4681 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303890 4681 flags.go:64] FLAG: --max-open-files="1000000" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303895 4681 flags.go:64] FLAG: --max-pods="110" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303899 4681 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303904 4681 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303908 4681 flags.go:64] FLAG: --memory-manager-policy="None" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303912 4681 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303917 4681 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303920 4681 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303924 4681 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303934 4681 flags.go:64] FLAG: --node-status-max-images="50" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303938 4681 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303942 4681 flags.go:64] FLAG: --oom-score-adj="-999" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303946 4681 flags.go:64] FLAG: --pod-cidr="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303950 4681 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303957 4681 flags.go:64] FLAG: --pod-manifest-path="" Jan 22 09:03:29 crc 
kubenswrapper[4681]: I0122 09:03:29.303961 4681 flags.go:64] FLAG: --pod-max-pids="-1" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303965 4681 flags.go:64] FLAG: --pods-per-core="0" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303969 4681 flags.go:64] FLAG: --port="10250" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303972 4681 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303976 4681 flags.go:64] FLAG: --provider-id="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303980 4681 flags.go:64] FLAG: --qos-reserved="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303984 4681 flags.go:64] FLAG: --read-only-port="10255" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303988 4681 flags.go:64] FLAG: --register-node="true" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303992 4681 flags.go:64] FLAG: --register-schedulable="true" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.303996 4681 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304002 4681 flags.go:64] FLAG: --registry-burst="10" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304006 4681 flags.go:64] FLAG: --registry-qps="5" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304010 4681 flags.go:64] FLAG: --reserved-cpus="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304014 4681 flags.go:64] FLAG: --reserved-memory="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304019 4681 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304023 4681 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304027 4681 flags.go:64] FLAG: --rotate-certificates="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304031 4681 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304035 4681 flags.go:64] FLAG: --runonce="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304039 4681 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304043 4681 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304048 4681 flags.go:64] FLAG: --seccomp-default="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304052 4681 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304056 4681 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304060 4681 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304065 4681 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304069 4681 flags.go:64] FLAG: --storage-driver-password="root" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304073 4681 flags.go:64] FLAG: --storage-driver-secure="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304077 4681 flags.go:64] FLAG: --storage-driver-table="stats" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304081 4681 flags.go:64] FLAG: --storage-driver-user="root" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304085 4681 flags.go:64] FLAG: 
--streaming-connection-idle-timeout="4h0m0s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304089 4681 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304093 4681 flags.go:64] FLAG: --system-cgroups="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304096 4681 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304104 4681 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304107 4681 flags.go:64] FLAG: --tls-cert-file="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304111 4681 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304116 4681 flags.go:64] FLAG: --tls-min-version="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304120 4681 flags.go:64] FLAG: --tls-private-key-file="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304124 4681 flags.go:64] FLAG: --topology-manager-policy="none" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304128 4681 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304131 4681 flags.go:64] FLAG: --topology-manager-scope="container" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304136 4681 flags.go:64] FLAG: --v="2" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304141 4681 flags.go:64] FLAG: --version="false" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304146 4681 flags.go:64] FLAG: --vmodule="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304199 4681 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304204 4681 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304329 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304335 4681 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304339 4681 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304342 4681 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304347 4681 feature_gate.go:330] unrecognized feature gate: Example Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304351 4681 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304355 4681 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304358 4681 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304365 4681 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304370 4681 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
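The long run of `flags.go:64] FLAG: --name="value"` lines above is the kubelet logging every flag it knows about, with its effective value (defaults included), at verbosity 2. As a hedged sketch of that dump pattern only, using spf13/pflag's VisitAll over a couple of hypothetical flags:

```go
// Minimal sketch of dumping every defined flag as `FLAG: --name="value"`,
// similar in shape to the flags.go:64 lines above. Illustrative only.
package main

import (
	"fmt"
	"os"

	"github.com/spf13/pflag"
)

func main() {
	fs := pflag.NewFlagSet("sketch", pflag.ContinueOnError)

	// A couple of hypothetical flags standing in for the kubelet's set.
	fs.String("node-ip", "", "node IP address")
	fs.Int32("max-pods", 110, "maximum number of pods")

	if err := fs.Parse(os.Args[1:]); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}

	// VisitAll walks every defined flag, whether or not it was set on the
	// command line, which matches the dump above showing defaults too.
	fs.VisitAll(func(f *pflag.Flag) {
		fmt.Printf("FLAG: --%s=%q\n", f.Name, f.Value.String())
	})
}
```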
Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304374 4681 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304378 4681 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304382 4681 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304386 4681 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304389 4681 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304393 4681 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304397 4681 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304401 4681 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304404 4681 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304408 4681 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304411 4681 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304415 4681 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304418 4681 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304425 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304430 4681 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304434 4681 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304438 4681 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304442 4681 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304446 4681 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304451 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304455 4681 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304459 4681 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304462 4681 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304466 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304470 4681 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304474 4681 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304478 4681 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304481 4681 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304485 4681 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304488 4681 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304493 4681 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304496 4681 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304500 4681 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304503 4681 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304507 4681 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304511 4681 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304514 4681 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304517 4681 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304521 4681 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304524 4681 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304527 4681 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304531 4681 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304534 4681 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304538 4681 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304541 4681 feature_gate.go:330] unrecognized feature gate: 
InsightsOnDemandDataGather Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304544 4681 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304547 4681 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304552 4681 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304556 4681 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304559 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304563 4681 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304566 4681 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304570 4681 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304573 4681 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304577 4681 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304580 4681 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304584 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304588 4681 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304591 4681 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304595 4681 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.304598 4681 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.304609 4681 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.311590 4681 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.311834 4681 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311887 4681 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311904 4681 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311910 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 
09:03:29.311918 4681 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311925 4681 feature_gate.go:330] unrecognized feature gate: Example Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311930 4681 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311935 4681 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311950 4681 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311955 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311960 4681 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311965 4681 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311969 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311973 4681 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311977 4681 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311981 4681 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311986 4681 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311990 4681 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311994 4681 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.311998 4681 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312002 4681 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312007 4681 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312012 4681 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312017 4681 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312021 4681 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312024 4681 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312028 4681 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312031 4681 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312035 4681 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312038 4681 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312042 4681 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312045 4681 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312049 4681 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312054 4681 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312060 4681 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312071 4681 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312079 4681 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312084 4681 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312089 4681 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312094 4681 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312098 4681 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312104 4681 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312108 4681 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312113 4681 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312117 4681 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312121 4681 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312125 4681 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312129 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312134 4681 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312138 4681 feature_gate.go:330] unrecognized feature gate: 
AlibabaPlatform Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312142 4681 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312147 4681 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312151 4681 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312155 4681 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312159 4681 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312162 4681 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312167 4681 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312173 4681 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312176 4681 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312181 4681 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312185 4681 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312189 4681 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312192 4681 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312196 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312201 4681 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312206 4681 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312210 4681 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312215 4681 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312219 4681 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312223 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312228 4681 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312233 4681 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.312240 4681 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312445 4681 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312456 4681 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312461 4681 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312465 4681 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312469 4681 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312473 4681 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312476 4681 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312480 4681 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312483 4681 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312487 4681 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312490 4681 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312494 4681 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312498 4681 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312501 4681 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312505 4681 feature_gate.go:330] unrecognized feature gate: 
NetworkDiagnosticsConfig Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312508 4681 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312512 4681 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312515 4681 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312519 4681 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312522 4681 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312526 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312530 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312533 4681 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312537 4681 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312540 4681 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312544 4681 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312547 4681 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312551 4681 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312554 4681 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312558 4681 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312561 4681 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312565 4681 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312568 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312571 4681 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312576 4681 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312579 4681 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312582 4681 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312586 4681 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312589 4681 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312593 4681 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 
09:03:29.312596 4681 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312600 4681 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312603 4681 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312607 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312610 4681 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312614 4681 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312618 4681 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312623 4681 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312627 4681 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312631 4681 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312635 4681 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312639 4681 feature_gate.go:330] unrecognized feature gate: Example Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312643 4681 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312647 4681 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312651 4681 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312656 4681 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312660 4681 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312664 4681 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312668 4681 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312672 4681 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312676 4681 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312680 4681 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312685 4681 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312688 4681 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312692 4681 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312696 4681 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312700 4681 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312704 4681 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312707 4681 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312711 4681 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.312715 4681 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.312722 4681 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.312865 4681 server.go:940] "Client rotation is on, will bootstrap in background" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.315553 4681 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.315628 4681 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
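The bootstrap and rotation lines above (and the expiration/rotation-deadline entry that follows) depend on reading the current client certificate from /var/lib/kubelet/pki/kubelet-client-current.pem and checking its validity window. The sketch below shows the kind of check involved: it loads that PEM file and prints the certificate's expiry. The path is taken from the log; everything else is an illustration, not the kubelet's certificate manager:

```go
// Minimal sketch of the kind of check behind the rotation messages above:
// read a PEM-encoded client certificate and report its expiry.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
)

func main() {
	data, err := os.ReadFile("/var/lib/kubelet/pki/kubelet-client-current.pem")
	if err != nil {
		log.Fatalf("reading cert: %v", err)
	}

	// The file holds both certificate and key; take the first CERTIFICATE block.
	for block, rest := pem.Decode(data); block != nil; block, rest = pem.Decode(rest) {
		if block.Type != "CERTIFICATE" {
			continue
		}
		cert, err := x509.ParseCertificate(block.Bytes)
		if err != nil {
			log.Fatalf("parsing certificate: %v", err)
		}
		fmt.Printf("subject=%s notAfter=%s\n", cert.Subject, cert.NotAfter)
		return
	}
	log.Fatal("no CERTIFICATE block found")
}
```

The CSR error that follows (connection refused to api-int.crc.testing:6443) just reflects the API server not yet being reachable this early in boot; rotation is attempted again later in the log once connectivity exists.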
Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.316493 4681 server.go:997] "Starting client certificate rotation" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.316513 4681 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.316754 4681 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-17 21:10:07.179983061 +0000 UTC Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.316910 4681 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.323283 4681 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 22 09:03:29 crc kubenswrapper[4681]: E0122 09:03:29.326646 4681 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.327496 4681 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.336534 4681 log.go:25] "Validated CRI v1 runtime API" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.352658 4681 log.go:25] "Validated CRI v1 image API" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.354691 4681 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.358521 4681 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-22-08-59-06-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.358553 4681 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.371637 4681 manager.go:217] Machine: {Timestamp:2026-01-22 09:03:29.370547125 +0000 UTC m=+0.196457650 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:9e6f37b2-9cc8-489a-8270-7febca8a276e BootID:eaa07520-342b-43b3-b7a3-09686a27fb31 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 
Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:10:d5:b4 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:10:d5:b4 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:63:96:ed Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:1f:9a:4e Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:a7:ef:8b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:c9:5f:58 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ea:17:70:bf:c0:08 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ae:50:59:51:c6:e5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] 
Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.371856 4681 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.371983 4681 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.372587 4681 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.372744 4681 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.372781 4681 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.372976 4681 topology_manager.go:138] "Creating topology manager with none policy" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.372987 4681 
container_manager_linux.go:303] "Creating device plugin manager" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.373221 4681 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.373250 4681 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.373463 4681 state_mem.go:36] "Initialized new in-memory state store" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.373546 4681 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.374325 4681 kubelet.go:418] "Attempting to sync node with API server" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.374352 4681 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.374379 4681 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.374393 4681 kubelet.go:324] "Adding apiserver pod source" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.374405 4681 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.376307 4681 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.377084 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.377354 4681 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
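The reflector and CSR failures above all end in "dial tcp 38.102.83.130:6443: connect: connection refused" because nothing is serving on the API endpoint yet at this point in the boot. A plain TCP dial to the same endpoint reproduces exactly that condition; this probe is an illustrative sketch, not part of the kubelet:

```go
// apicheck.go - attempts the same TCP connection the kubelet is making to the
// internal API endpoint seen in the reflector errors above.
package main

import (
	"fmt"
	"net"
	"os"
	"time"
)

func main() {
	// Endpoint copied from the log entries above.
	addr := "api-int.crc.testing:6443"

	conn, err := net.DialTimeout("tcp", addr, 3*time.Second)
	if err != nil {
		// During early boot the kube-apiserver static pod is not up yet,
		// so "connection refused" here matches what the kubelet logs.
		fmt.Fprintf(os.Stderr, "dial %s: %v\n", addr, err)
		os.Exit(1)
	}
	defer conn.Close()
	fmt.Printf("connected to %s (remote %s)\n", addr, conn.RemoteAddr())
}
```

Once the kube-apiserver static pod is listening on 6443, the same dial should succeed and the reflector errors in the log stop repeating.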
Jan 22 09:03:29 crc kubenswrapper[4681]: E0122 09:03:29.377860 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.378489 4681 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.378724 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Jan 22 09:03:29 crc kubenswrapper[4681]: E0122 09:03:29.379104 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.379132 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.379198 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.379218 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.379231 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.379249 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.379291 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.379306 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.379325 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.379339 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.379349 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.379412 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.379425 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.380306 4681 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.381288 4681 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.382290 4681 
server.go:1280] "Started kubelet" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.382868 4681 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.382870 4681 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.384195 4681 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.385349 4681 server.go:460] "Adding debug handlers to kubelet server" Jan 22 09:03:29 crc systemd[1]: Started Kubernetes Kubelet. Jan 22 09:03:29 crc kubenswrapper[4681]: E0122 09:03:29.386886 4681 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.130:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188d022a501a202f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-22 09:03:29.382244399 +0000 UTC m=+0.208154924,LastTimestamp:2026-01-22 09:03:29.382244399 +0000 UTC m=+0.208154924,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.388535 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.388618 4681 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.388681 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 18:27:43.279618023 +0000 UTC Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.388886 4681 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 22 09:03:29 crc kubenswrapper[4681]: E0122 09:03:29.388999 4681 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.388898 4681 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.389874 4681 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 22 09:03:29 crc kubenswrapper[4681]: E0122 09:03:29.389912 4681 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="200ms" Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.389843 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Jan 22 09:03:29 crc kubenswrapper[4681]: E0122 09:03:29.390668 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.390305 4681 factory.go:55] Registering systemd factory Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.390724 4681 factory.go:221] Registration of the systemd container factory successfully Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.393318 4681 factory.go:153] Registering CRI-O factory Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.393351 4681 factory.go:221] Registration of the crio container factory successfully Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.393431 4681 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.393454 4681 factory.go:103] Registering Raw factory Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.393471 4681 manager.go:1196] Started watching for new ooms in manager Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.394062 4681 manager.go:319] Starting recovery of all containers Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398441 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398503 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398518 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398538 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398547 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398561 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398571 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" 
seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398581 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398598 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398607 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398621 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398632 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398652 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398667 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398682 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398696 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398711 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398729 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 22 
09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398744 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398760 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398770 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398780 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398794 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398805 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398820 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398830 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398847 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398859 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398876 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 
09:03:29.398886 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398902 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398919 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398934 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398946 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398960 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398973 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398984 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.398995 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399014 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399029 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399046 4681 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399059 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399071 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399090 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399103 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399143 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399153 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399164 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399177 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399221 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399237 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399248 4681 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399292 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399310 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399331 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399351 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399363 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399378 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399393 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399407 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399417 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399427 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399439 4681 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399453 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399468 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399485 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399500 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399519 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399536 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399551 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399569 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399583 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399603 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399616 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399630 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399646 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399667 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399688 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399702 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399775 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399792 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399804 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399822 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399844 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399857 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399876 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399890 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399903 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399924 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.399933 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.400730 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.400830 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.400873 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.400900 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.400930 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.400964 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.400993 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.401027 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.401057 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.401080 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.401113 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.401137 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.401172 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.401197 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.401284 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.401320 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.401345 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.401371 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.401392 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.401409 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.401436 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.403640 4681 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.403783 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.403842 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.403890 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.403922 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.404009 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.404055 4681 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.404094 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.404135 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.404165 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.404273 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.404340 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.404377 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.404432 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.407880 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.407924 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.407966 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.407982 4681 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408001 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408016 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408030 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408045 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408059 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408073 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408087 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408101 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408116 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408130 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408144 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408159 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408174 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408188 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408201 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408214 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408229 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408243 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408256 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408284 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408297 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408311 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408325 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408338 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408352 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408366 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408378 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408392 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408407 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408420 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408432 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408447 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408462 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408476 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408488 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408502 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408515 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408529 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408541 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408557 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408570 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408582 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408595 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408608 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408621 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408634 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408648 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408662 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408710 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408724 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408737 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408751 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408789 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408807 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408821 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408839 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408852 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408871 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408883 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408901 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408914 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408930 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408944 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408957 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408975 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.408987 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.409000 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.409012 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.409025 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.409038 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.409049 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.409062 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.409074 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.409088 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.409101 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.409113 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.409126 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.409139 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.409151 4681 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.409163 4681 reconstruct.go:97] "Volume reconstruction finished" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.409172 4681 reconciler.go:26] "Reconciler: start to sync state" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.434730 4681 manager.go:324] Recovery completed Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.446781 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.448538 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.448579 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.448595 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.448675 4681 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.449451 4681 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.449480 4681 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.449503 4681 state_mem.go:36] "Initialized new in-memory state store" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.451128 4681 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.451187 4681 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.451226 4681 kubelet.go:2335] "Starting kubelet main sync loop" Jan 22 09:03:29 crc kubenswrapper[4681]: E0122 09:03:29.451398 4681 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.452744 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Jan 22 09:03:29 crc kubenswrapper[4681]: E0122 09:03:29.452952 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.463910 4681 policy_none.go:49] "None policy: Start" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.465947 4681 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.465984 4681 state_mem.go:35] "Initializing new in-memory state store" Jan 22 09:03:29 crc kubenswrapper[4681]: E0122 09:03:29.489432 4681 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.517761 4681 manager.go:334] "Starting Device Plugin manager" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.517815 4681 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.517829 4681 server.go:79] "Starting device plugin registration server" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.518315 4681 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.518334 4681 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.518576 4681 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.518670 4681 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.518679 4681 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 22 09:03:29 crc kubenswrapper[4681]: E0122 09:03:29.532778 4681 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.552217 4681 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Jan 22 09:03:29 crc kubenswrapper[4681]: 
I0122 09:03:29.552346 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.554452 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.554524 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.554556 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.554777 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.555016 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.555100 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.555827 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.555870 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.555883 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.556024 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.556184 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.556211 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.556612 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.556661 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.556675 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.557042 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.557093 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.557103 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.557124 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.557166 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.557192 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.557213 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.557362 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.557397 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.560379 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.560427 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.560445 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.560524 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.560545 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.560557 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.560811 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.561229 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.561283 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.562387 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.562441 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.562484 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.562811 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.562835 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.562870 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.563227 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.563303 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.565612 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.565653 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.565667 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:29 crc kubenswrapper[4681]: E0122 09:03:29.591637 4681 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="400ms" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.611735 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.611972 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.612077 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.612232 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.612376 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.612482 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.612583 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.612686 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.612803 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.612915 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.613020 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.613124 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.613228 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.613387 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.613497 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.619008 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.620785 4681 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.620837 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.620856 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.620894 4681 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 22 09:03:29 crc kubenswrapper[4681]: E0122 09:03:29.621642 4681 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.130:6443: connect: connection refused" node="crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.714962 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715034 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715059 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715111 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715139 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715162 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715188 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715211 4681 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715232 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715252 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715301 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715324 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715342 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715361 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715380 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715684 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715714 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715818 4681 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715824 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715827 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715848 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715893 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715900 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715920 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715942 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.715925 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.716019 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.716082 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.716031 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.716336 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.822878 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.824537 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.824613 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.824632 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.824682 4681 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 22 09:03:29 crc kubenswrapper[4681]: E0122 09:03:29.825306 4681 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.130:6443: connect: connection refused" node="crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.903400 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.921330 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.926534 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-d916a8f20daff4b340ca29cde2226a6d8e52252749b45df94e7b248f0d48455d WatchSource:0}: Error finding container d916a8f20daff4b340ca29cde2226a6d8e52252749b45df94e7b248f0d48455d: Status 404 returned error can't find the container with id d916a8f20daff4b340ca29cde2226a6d8e52252749b45df94e7b248f0d48455d Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.946290 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-4c7ad519ef7debd0befe50b931ec79e91763006e03c949a80bccd59268d2088b WatchSource:0}: Error finding container 4c7ad519ef7debd0befe50b931ec79e91763006e03c949a80bccd59268d2088b: Status 404 returned error can't find the container with id 4c7ad519ef7debd0befe50b931ec79e91763006e03c949a80bccd59268d2088b Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.948702 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.954679 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: I0122 09:03:29.959674 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.963982 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-2f08d2916e4a4d82a17898f8ed8ee45acf9883280782ba4353fad88c095cc123 WatchSource:0}: Error finding container 2f08d2916e4a4d82a17898f8ed8ee45acf9883280782ba4353fad88c095cc123: Status 404 returned error can't find the container with id 2f08d2916e4a4d82a17898f8ed8ee45acf9883280782ba4353fad88c095cc123 Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.978867 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-392a759271b746f8be4dd580e7cb7a5604c8cd23e0bbcbda4bf54a634d732696 WatchSource:0}: Error finding container 392a759271b746f8be4dd580e7cb7a5604c8cd23e0bbcbda4bf54a634d732696: Status 404 returned error can't find the container with id 392a759271b746f8be4dd580e7cb7a5604c8cd23e0bbcbda4bf54a634d732696 Jan 22 09:03:29 crc kubenswrapper[4681]: W0122 09:03:29.981336 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-a474743ffb289e1473fd2e501947c3ed0db81e347b3db9d6768ca84dfe22c334 WatchSource:0}: Error finding container a474743ffb289e1473fd2e501947c3ed0db81e347b3db9d6768ca84dfe22c334: Status 404 returned error can't find the container with id a474743ffb289e1473fd2e501947c3ed0db81e347b3db9d6768ca84dfe22c334 Jan 22 09:03:29 crc kubenswrapper[4681]: E0122 09:03:29.993051 4681 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="800ms" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.226374 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.228832 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.228893 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.228910 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.228942 4681 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 22 09:03:30 crc kubenswrapper[4681]: E0122 09:03:30.229357 4681 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.130:6443: connect: connection refused" node="crc" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.382497 4681 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.389669 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 00:40:44.872979062 +0000 UTC Jan 22 09:03:30 crc kubenswrapper[4681]: W0122 09:03:30.413848 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Jan 22 09:03:30 crc kubenswrapper[4681]: E0122 09:03:30.413985 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.458417 4681 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9bcafe40ad5f5849367e3d6baf2b079f1cebf9ab7e385bee82a5036e806c3867" exitCode=0 Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.458507 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9bcafe40ad5f5849367e3d6baf2b079f1cebf9ab7e385bee82a5036e806c3867"} Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.458708 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"392a759271b746f8be4dd580e7cb7a5604c8cd23e0bbcbda4bf54a634d732696"} Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.458962 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.460347 4681 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="d492b308420e7bfbcfe877e1c23d18e38154adabc0693394ef69a1e56eafa481" exitCode=0 Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.460426 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"d492b308420e7bfbcfe877e1c23d18e38154adabc0693394ef69a1e56eafa481"} Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.460464 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2f08d2916e4a4d82a17898f8ed8ee45acf9883280782ba4353fad88c095cc123"} Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.460575 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.460617 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.460653 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.460663 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.461539 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.461568 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.461578 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.462165 4681 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="241deddbc1ddb7c0b64c98f1a78649428dc35b7b23ed70b7cc82703393e94e48" exitCode=0 Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.462232 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"241deddbc1ddb7c0b64c98f1a78649428dc35b7b23ed70b7cc82703393e94e48"} Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.462318 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4c7ad519ef7debd0befe50b931ec79e91763006e03c949a80bccd59268d2088b"} Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.462420 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.463649 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.463671 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:30 crc 
kubenswrapper[4681]: I0122 09:03:30.463681 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.463723 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"91b42d0d981cb9e194bf8ad0a8b945e348cde3f9ed52985d54c501b1768d7f8f"} Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.463751 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d916a8f20daff4b340ca29cde2226a6d8e52252749b45df94e7b248f0d48455d"} Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.465480 4681 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dee106e9b193951c3f506f0eb522812132719914b98cc7dbf60e1597311227ad" exitCode=0 Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.465519 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dee106e9b193951c3f506f0eb522812132719914b98cc7dbf60e1597311227ad"} Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.465537 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a474743ffb289e1473fd2e501947c3ed0db81e347b3db9d6768ca84dfe22c334"} Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.465635 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.466473 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.466505 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.466517 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.470061 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.471087 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.471127 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:30 crc kubenswrapper[4681]: I0122 09:03:30.471143 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:30 crc kubenswrapper[4681]: W0122 09:03:30.600535 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Jan 22 09:03:30 crc kubenswrapper[4681]: E0122 09:03:30.600618 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Jan 22 09:03:30 crc kubenswrapper[4681]: W0122 09:03:30.712895 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Jan 22 09:03:30 crc kubenswrapper[4681]: E0122 09:03:30.712984 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Jan 22 09:03:30 crc kubenswrapper[4681]: E0122 09:03:30.795027 4681 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="1.6s" Jan 22 09:03:30 crc kubenswrapper[4681]: W0122 09:03:30.951955 4681 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.130:6443: connect: connection refused Jan 22 09:03:30 crc kubenswrapper[4681]: E0122 09:03:30.952082 4681 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.130:6443: connect: connection refused" logger="UnhandledError" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.029938 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.031557 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.031603 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.031618 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.031649 4681 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 22 09:03:31 crc kubenswrapper[4681]: E0122 09:03:31.032403 4681 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.130:6443: connect: connection refused" node="crc" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.390622 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 13:20:32.027160508 +0000 UTC Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.476966 4681 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="797d061f34c7f8036aa53ae87140f2faec0a8c933bed1fe5e37750766808c68b" exitCode=0 Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.477063 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"797d061f34c7f8036aa53ae87140f2faec0a8c933bed1fe5e37750766808c68b"} Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.477283 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.478280 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.478318 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.478330 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.479943 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"09b6b3238451b5fd203fccef93e5c2d46598fadd04f45c8bc235bacef2488275"} Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.480069 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.482410 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.482435 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.482448 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.485033 4681 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.490755 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8cf8a6c324e87acd1e461d930235b81ef8858fb3e986ce3c6726bb61f2532099"} Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.490940 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e52c7999d1b582700b106962d1a82a38adfe0ea9f07fa479ccc577318625dd28"} Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.491067 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8bd03ea5120ebfbfad179467e620e774bf7ef8c9a0804cc04eb5090d3696fcee"} Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.490837 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.492240 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.492374 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.492470 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.495158 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0c980d198d01212418816af80bff402b9851a466d7036e847a78038d06d92dd0"} Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.495303 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d3ffd8edb629f15c010f0b2859c4f85a43e5758f5a190aeb3b680ef0fe1200f0"} Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.495397 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6aef33d5e1344934edcd521a1fff74f31b6555857b22cbff3bfee112ef646dbd"} Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.495171 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.496478 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.496573 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.496740 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.501754 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c29c505da8ca30c818f5ba9818dea035254376f5739c0433c507a6dc7cf4698f"} Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.501882 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cb0db720088c0c3720217f6756481511add404cafce819ef1e9cfeeada8b6525"} Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.501969 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dca2254db0d824fcbf184a198ab5262e492c608a943a2c13c708afbbe9f7809c"} Jan 22 09:03:31 crc kubenswrapper[4681]: I0122 09:03:31.502043 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c2a67c9a932172beaa283e3822140faec7d1075d261cf5baa5dfade505706598"} Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.390739 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 22:49:28.28611938 +0000 UTC Jan 22 09:03:32 crc 
kubenswrapper[4681]: I0122 09:03:32.507321 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"281d12e11d304b5374e32fa76a0f626fb5cc72dcb259f1e6f63f186558d14faf"} Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.507381 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.508107 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.508130 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.508140 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.508878 4681 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0e7e5c0067eb2838a4346240c7f0a3aa9876de8c335eb7baa78a7fd9180c4b25" exitCode=0 Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.508941 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0e7e5c0067eb2838a4346240c7f0a3aa9876de8c335eb7baa78a7fd9180c4b25"} Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.508989 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.508951 4681 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.509038 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.509063 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.509581 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.509608 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.509619 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.510302 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.510324 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.510335 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.510326 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.510420 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:32 
crc kubenswrapper[4681]: I0122 09:03:32.510433 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.632589 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.633950 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.634009 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.634025 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:32 crc kubenswrapper[4681]: I0122 09:03:32.634076 4681 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 22 09:03:33 crc kubenswrapper[4681]: I0122 09:03:33.204698 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:03:33 crc kubenswrapper[4681]: I0122 09:03:33.391363 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 12:57:19.455777646 +0000 UTC Jan 22 09:03:33 crc kubenswrapper[4681]: I0122 09:03:33.516665 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ae238a6183c850f6bdfbf3a1885d47fe125dc23b63e32bd1d4aa487ed2805c4c"} Jan 22 09:03:33 crc kubenswrapper[4681]: I0122 09:03:33.516709 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"db562e61f638140d3cf1ae6afccb6493b62767075989f5b1828b98878971a75f"} Jan 22 09:03:33 crc kubenswrapper[4681]: I0122 09:03:33.516721 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a5391d29a26abd65f41dc6afe945e5f63e73dc09a6bc8d63adbce8f1c8ee151e"} Jan 22 09:03:33 crc kubenswrapper[4681]: I0122 09:03:33.516733 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"741166c92b268e9cf7b7663081907db064e921b622600bfd1a7135a8e7639ec5"} Jan 22 09:03:33 crc kubenswrapper[4681]: I0122 09:03:33.516806 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:33 crc kubenswrapper[4681]: I0122 09:03:33.516866 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:03:33 crc kubenswrapper[4681]: I0122 09:03:33.518221 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:33 crc kubenswrapper[4681]: I0122 09:03:33.518359 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:33 crc kubenswrapper[4681]: I0122 09:03:33.518397 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:34 crc kubenswrapper[4681]: I0122 09:03:34.392209 4681 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 13:28:54.568036514 +0000 UTC Jan 22 09:03:34 crc kubenswrapper[4681]: I0122 09:03:34.523478 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d8702dec0514120064dbfe60b959207918ee945ecdc1280aea4e8603ab9e1cd4"} Jan 22 09:03:34 crc kubenswrapper[4681]: I0122 09:03:34.523562 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:34 crc kubenswrapper[4681]: I0122 09:03:34.523636 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:34 crc kubenswrapper[4681]: I0122 09:03:34.527460 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:34 crc kubenswrapper[4681]: I0122 09:03:34.527501 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:34 crc kubenswrapper[4681]: I0122 09:03:34.527513 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:34 crc kubenswrapper[4681]: I0122 09:03:34.527703 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:34 crc kubenswrapper[4681]: I0122 09:03:34.527755 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:34 crc kubenswrapper[4681]: I0122 09:03:34.527777 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:35 crc kubenswrapper[4681]: I0122 09:03:35.391503 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:03:35 crc kubenswrapper[4681]: I0122 09:03:35.391690 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:35 crc kubenswrapper[4681]: I0122 09:03:35.392449 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 23:12:45.227533943 +0000 UTC Jan 22 09:03:35 crc kubenswrapper[4681]: I0122 09:03:35.392892 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:35 crc kubenswrapper[4681]: I0122 09:03:35.392965 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:35 crc kubenswrapper[4681]: I0122 09:03:35.392979 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:35 crc kubenswrapper[4681]: I0122 09:03:35.525718 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:35 crc kubenswrapper[4681]: I0122 09:03:35.526689 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:35 crc kubenswrapper[4681]: I0122 09:03:35.526754 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:35 crc kubenswrapper[4681]: I0122 09:03:35.526773 4681 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:36 crc kubenswrapper[4681]: I0122 09:03:36.392621 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 22:32:38.165741088 +0000 UTC Jan 22 09:03:37 crc kubenswrapper[4681]: I0122 09:03:37.190570 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:03:37 crc kubenswrapper[4681]: I0122 09:03:37.190782 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:37 crc kubenswrapper[4681]: I0122 09:03:37.192737 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:37 crc kubenswrapper[4681]: I0122 09:03:37.192795 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:37 crc kubenswrapper[4681]: I0122 09:03:37.192809 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:37 crc kubenswrapper[4681]: I0122 09:03:37.393515 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 10:59:00.203608005 +0000 UTC Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.103961 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.104312 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.105897 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.105948 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.105958 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.122750 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.123002 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.124514 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.124572 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.124588 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.130070 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.393669 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, 
rotation deadline is 2025-11-23 10:36:27.813307956 +0000 UTC Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.533663 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.533854 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.534807 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.534867 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.534891 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.634142 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.634508 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.636646 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.636702 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:38 crc kubenswrapper[4681]: I0122 09:03:38.636714 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:39 crc kubenswrapper[4681]: I0122 09:03:39.394690 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 11:04:05.561959006 +0000 UTC Jan 22 09:03:39 crc kubenswrapper[4681]: I0122 09:03:39.501997 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:03:39 crc kubenswrapper[4681]: E0122 09:03:39.533967 4681 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 22 09:03:39 crc kubenswrapper[4681]: I0122 09:03:39.536148 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:39 crc kubenswrapper[4681]: I0122 09:03:39.537482 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:39 crc kubenswrapper[4681]: I0122 09:03:39.537535 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:39 crc kubenswrapper[4681]: I0122 09:03:39.537546 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:39 crc kubenswrapper[4681]: I0122 09:03:39.547933 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 22 09:03:39 crc kubenswrapper[4681]: I0122 09:03:39.548215 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:39 crc kubenswrapper[4681]: I0122 09:03:39.549556 4681 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:39 crc kubenswrapper[4681]: I0122 09:03:39.549595 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:39 crc kubenswrapper[4681]: I0122 09:03:39.549605 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:40 crc kubenswrapper[4681]: I0122 09:03:40.394894 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 19:32:29.631966699 +0000 UTC Jan 22 09:03:40 crc kubenswrapper[4681]: I0122 09:03:40.539499 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:40 crc kubenswrapper[4681]: I0122 09:03:40.540611 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:40 crc kubenswrapper[4681]: I0122 09:03:40.540663 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:40 crc kubenswrapper[4681]: I0122 09:03:40.540680 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:40 crc kubenswrapper[4681]: I0122 09:03:40.550050 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:03:41 crc kubenswrapper[4681]: I0122 09:03:41.088149 4681 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 22 09:03:41 crc kubenswrapper[4681]: I0122 09:03:41.088245 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 22 09:03:41 crc kubenswrapper[4681]: I0122 09:03:41.383809 4681 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 22 09:03:41 crc kubenswrapper[4681]: I0122 09:03:41.395084 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 17:21:42.462265103 +0000 UTC Jan 22 09:03:41 crc kubenswrapper[4681]: E0122 09:03:41.486718 4681 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 22 09:03:41 crc kubenswrapper[4681]: I0122 09:03:41.542109 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:41 crc kubenswrapper[4681]: I0122 09:03:41.543111 4681 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:41 crc kubenswrapper[4681]: I0122 09:03:41.543145 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:41 crc kubenswrapper[4681]: I0122 09:03:41.543155 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:42 crc kubenswrapper[4681]: I0122 09:03:42.396089 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 13:33:45.013551508 +0000 UTC Jan 22 09:03:42 crc kubenswrapper[4681]: E0122 09:03:42.396315 4681 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Jan 22 09:03:42 crc kubenswrapper[4681]: I0122 09:03:42.502423 4681 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 22 09:03:42 crc kubenswrapper[4681]: I0122 09:03:42.502513 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 22 09:03:42 crc kubenswrapper[4681]: E0122 09:03:42.635316 4681 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Jan 22 09:03:42 crc kubenswrapper[4681]: I0122 09:03:42.680971 4681 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Jan 22 09:03:42 crc kubenswrapper[4681]: I0122 09:03:42.681046 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 22 09:03:42 crc kubenswrapper[4681]: I0122 09:03:42.689374 4681 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]log ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]etcd ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/openshift.io-api-request-count-filter 
ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/openshift.io-startkubeinformers ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/generic-apiserver-start-informers ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/priority-and-fairness-config-consumer ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/priority-and-fairness-filter ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/start-apiextensions-informers ok Jan 22 09:03:42 crc kubenswrapper[4681]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Jan 22 09:03:42 crc kubenswrapper[4681]: [-]poststarthook/crd-informer-synced failed: reason withheld Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/start-system-namespaces-controller ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/start-cluster-authentication-info-controller ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/start-legacy-token-tracking-controller ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/start-service-ip-repair-controllers ok Jan 22 09:03:42 crc kubenswrapper[4681]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Jan 22 09:03:42 crc kubenswrapper[4681]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Jan 22 09:03:42 crc kubenswrapper[4681]: [-]poststarthook/priority-and-fairness-config-producer failed: reason withheld Jan 22 09:03:42 crc kubenswrapper[4681]: [-]poststarthook/bootstrap-controller failed: reason withheld Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/start-kube-aggregator-informers ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/apiservice-status-local-available-controller ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/apiservice-status-remote-available-controller ok Jan 22 09:03:42 crc kubenswrapper[4681]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/apiservice-wait-for-first-sync ok Jan 22 09:03:42 crc kubenswrapper[4681]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/kube-apiserver-autoregistration ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]autoregister-completion ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/apiservice-openapi-controller ok Jan 22 09:03:42 crc kubenswrapper[4681]: [+]poststarthook/apiservice-openapiv3-controller ok Jan 22 09:03:42 crc kubenswrapper[4681]: livez check failed Jan 22 09:03:42 crc kubenswrapper[4681]: I0122 09:03:42.689455 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 09:03:43 crc kubenswrapper[4681]: I0122 09:03:43.396538 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 02:01:39.301446638 +0000 UTC Jan 22 09:03:44 crc kubenswrapper[4681]: I0122 09:03:44.397587 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 21:12:29.15071846 +0000 UTC Jan 22 09:03:45 crc kubenswrapper[4681]: I0122 09:03:45.398576 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 01:34:41.890451602 +0000 UTC Jan 22 09:03:45 crc kubenswrapper[4681]: I0122 09:03:45.775633 4681 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 22 09:03:45 crc kubenswrapper[4681]: I0122 09:03:45.790014 4681 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 22 09:03:45 crc kubenswrapper[4681]: I0122 09:03:45.835950 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:45 crc kubenswrapper[4681]: I0122 09:03:45.840102 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:45 crc kubenswrapper[4681]: I0122 09:03:45.840180 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:45 crc kubenswrapper[4681]: I0122 09:03:45.840195 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:45 crc kubenswrapper[4681]: I0122 09:03:45.840228 4681 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 22 09:03:45 crc kubenswrapper[4681]: E0122 09:03:45.845808 4681 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 22 09:03:46 crc kubenswrapper[4681]: I0122 09:03:46.399334 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 21:01:43.483818744 +0000 UTC Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.199505 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.199769 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.201382 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.201452 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.201475 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.207415 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:03:47 crc 
kubenswrapper[4681]: I0122 09:03:47.399879 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 01:24:38.683928596 +0000 UTC Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.565248 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.566608 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.566658 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.566671 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.665469 4681 trace.go:236] Trace[447677324]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Jan-2026 09:03:33.175) (total time: 14489ms): Jan 22 09:03:47 crc kubenswrapper[4681]: Trace[447677324]: ---"Objects listed" error: 14489ms (09:03:47.665) Jan 22 09:03:47 crc kubenswrapper[4681]: Trace[447677324]: [14.489326419s] [14.489326419s] END Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.665536 4681 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.665700 4681 trace.go:236] Trace[847161464]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Jan-2026 09:03:33.641) (total time: 14024ms): Jan 22 09:03:47 crc kubenswrapper[4681]: Trace[847161464]: ---"Objects listed" error: 14024ms (09:03:47.665) Jan 22 09:03:47 crc kubenswrapper[4681]: Trace[847161464]: [14.024145432s] [14.024145432s] END Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.665734 4681 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.666225 4681 trace.go:236] Trace[1279419522]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Jan-2026 09:03:32.701) (total time: 14964ms): Jan 22 09:03:47 crc kubenswrapper[4681]: Trace[1279419522]: ---"Objects listed" error: 14964ms (09:03:47.666) Jan 22 09:03:47 crc kubenswrapper[4681]: Trace[1279419522]: [14.964882536s] [14.964882536s] END Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.666254 4681 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.667658 4681 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.669399 4681 trace.go:236] Trace[1402582038]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Jan-2026 09:03:33.378) (total time: 14290ms): Jan 22 09:03:47 crc kubenswrapper[4681]: Trace[1402582038]: ---"Objects listed" error: 14290ms (09:03:47.669) Jan 22 09:03:47 crc kubenswrapper[4681]: Trace[1402582038]: [14.29048849s] [14.29048849s] END Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.669439 4681 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.764635 4681 patch_prober.go:28] interesting pod/kube-apiserver-crc 
container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45824->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.764678 4681 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45826->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.764719 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45824->192.168.126.11:17697: read: connection reset by peer" Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.764743 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45826->192.168.126.11:17697: read: connection reset by peer" Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.765172 4681 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 22 09:03:47 crc kubenswrapper[4681]: I0122 09:03:47.765283 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.139219 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.155004 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.385307 4681 apiserver.go:52] "Watching apiserver" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.389652 4681 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.390029 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-etcd/etcd-crc"] Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.390480 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.390580 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.390480 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 09:03:48 crc kubenswrapper[4681]: E0122 09:03:48.390648 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 09:03:48 crc kubenswrapper[4681]: E0122 09:03:48.390616 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.390828 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.391693 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.391974 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:03:48 crc kubenswrapper[4681]: E0122 09:03:48.392233 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.395899 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.396069 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.396124 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.396120 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.396176 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.396208 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.396081 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.396314 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.396070 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.400038 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 16:07:48.497371524 +0000 UTC Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.433652 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.448888 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.467403 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.482024 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.490720 4681 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.495465 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.514821 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ba76b-1f78-4242-97ac-3bd3faa13b93\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5391d29a26abd65f41dc6afe945e5f63e73dc09a6bc8d63adbce8f1c8ee151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db562e61f638140d3cf1ae6afccb6493b62767075989f5b1828b98878971a75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-01-22T09:03:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae238a6183c850f6bdfbf3a1885d47fe125dc23b63e32bd1d4aa487ed2805c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8702dec0514120064dbfe60b959207918ee945ecdc1280aea4e8603ab9e1cd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://741166c92b268e9cf7b7663081907db064e921b622600bfd1a7135a8e7639ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bcafe40ad5f5849367e3d6baf2b079f1cebf9ab7e385bee82a5036e806c3867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bcafe40ad5f5849367e3d6baf2b079f1cebf9ab7e385bee82a5036e806c
3867\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T09:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://797d061f34c7f8036aa53ae87140f2faec0a8c933bed1fe5e37750766808c68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://797d061f34c7f8036aa53ae87140f2faec0a8c933bed1fe5e37750766808c68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T09:03:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e7e5c0067eb2838a4346240c7f0a3aa9876de8c335eb7baa78a7fd9180c4b25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e7e5c0067eb2838a4346240c7f0a3aa9876de8c335eb7baa78a7fd9180c4b25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T09:03:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T09:03:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T09:03:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.527593 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.539406 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.549917 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.569273 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.571601 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572399 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572435 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572456 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572475 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572490 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572508 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572529 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572544 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572561 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572581 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572601 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572618 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572658 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572674 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572691 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572710 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572727 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572748 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572767 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572783 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572801 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572818 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572836 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572854 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572873 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572862 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572895 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572914 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572933 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572948 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572967 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572983 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572993 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.572990 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573001 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573108 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573133 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573202 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573242 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573308 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573322 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573347 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573394 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573389 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573428 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573460 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573487 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573512 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573536 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573561 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573584 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573605 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573629 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573650 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573673 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573699 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573723 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573755 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573778 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573800 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573819 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573842 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573863 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573889 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573905 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573927 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573949 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573957 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573973 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.573999 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574006 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574026 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574049 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574073 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574096 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574117 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574144 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574167 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574189 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574210 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574222 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: 
"console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574232 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574236 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574274 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574253 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574278 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574299 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574324 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574347 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574370 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574393 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574414 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574440 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574444 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574462 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574469 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574486 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574509 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574532 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574554 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574576 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574640 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574664 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574710 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574714 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574725 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574732 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574790 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574799 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574828 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574856 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574883 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574915 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574944 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574973 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.574993 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575000 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575037 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575071 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575078 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575091 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575113 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575161 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575187 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575210 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575229 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575250 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575299 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575318 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575337 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575357 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575816 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575840 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575859 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575876 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575892 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575927 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575944 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575960 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575977 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575997 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576014 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576032 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576049 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576087 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576110 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576128 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576145 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576162 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576177 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576194 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576212 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576249 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576288 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577190 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577303 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577349 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577397 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577438 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577479 4681 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577601 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577650 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577693 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577735 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577769 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577811 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577856 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577892 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577942 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577990 4681 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578031 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578070 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578104 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578140 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578181 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578217 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578253 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578324 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578360 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: 
I0122 09:03:48.578404 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578450 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578489 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578526 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578570 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578605 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578644 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578683 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578740 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578777 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 22 
09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578814 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578851 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578891 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578935 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578974 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579015 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579060 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579097 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579134 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579170 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579209 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579248 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579316 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579355 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579392 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579437 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579476 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579676 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579718 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579756 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579798 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579839 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579878 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579915 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579958 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579995 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580029 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580068 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580149 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580200 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580243 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580311 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580353 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580414 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580452 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580495 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580537 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580578 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580621 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580661 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580707 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580746 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580896 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580926 4681 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580949 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580972 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580993 4681 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581014 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581038 4681 
reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581060 4681 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581082 4681 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581105 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581125 4681 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581147 4681 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581168 4681 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581196 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581221 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581242 4681 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581290 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581311 4681 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581332 4681 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581353 4681 reconciler_common.go:293] "Volume detached 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581373 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581394 4681 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575255 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575560 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575766 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575783 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575815 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.575862 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576042 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576186 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576237 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576373 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576438 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576526 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576603 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576706 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). 
InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576903 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.576926 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577085 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.577971 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578468 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578506 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578818 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578856 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.578915 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579131 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579191 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579208 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579230 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579251 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579381 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579419 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579488 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579524 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579690 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579789 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579806 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.579925 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580027 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580362 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580360 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580685 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.580867 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581132 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581208 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581671 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.581752 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.582250 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.582367 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.582628 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.582788 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.583022 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.583179 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.583239 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.582865 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.583378 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.583409 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.583420 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.583644 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.583665 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.586974 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.587234 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.587481 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.587627 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.587638 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.587998 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.588037 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.588113 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.588144 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.588925 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). 
InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.589287 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.589143 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.590007 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.590143 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.590311 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.590369 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: E0122 09:03:48.590532 4681 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.590557 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.590544 4681 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="281d12e11d304b5374e32fa76a0f626fb5cc72dcb259f1e6f63f186558d14faf" exitCode=255 Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.590755 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.593763 4681 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.593971 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.595169 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.596321 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"281d12e11d304b5374e32fa76a0f626fb5cc72dcb259f1e6f63f186558d14faf"} Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.590588 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.590594 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: E0122 09:03:48.590610 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:49.090587238 +0000 UTC m=+19.916497743 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.590838 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.590918 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.590947 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.591379 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.591965 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.591988 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.592193 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: E0122 09:03:48.592334 4681 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.592752 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.592769 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.593045 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.593192 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.593288 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.593625 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.593690 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.593790 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.593844 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.594134 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.594229 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.594880 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.595084 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.595445 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.596466 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.597143 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: E0122 09:03:48.597296 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:49.097252424 +0000 UTC m=+19.923162929 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 09:03:48 crc kubenswrapper[4681]: E0122 09:03:48.597511 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:03:49.09750154 +0000 UTC m=+19.923412035 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.597812 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.597923 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.597978 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.598075 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.598288 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.598539 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.599772 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.599847 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.600507 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.602122 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.602207 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.602418 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.602865 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.611027 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.611588 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.613204 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: E0122 09:03:48.614135 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 09:03:48 crc kubenswrapper[4681]: E0122 09:03:48.614159 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 09:03:48 crc kubenswrapper[4681]: E0122 09:03:48.614171 4681 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:48 crc kubenswrapper[4681]: E0122 09:03:48.614228 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:49.114212502 +0000 UTC m=+19.940123007 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.614703 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.614987 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 09:03:48 crc kubenswrapper[4681]: E0122 09:03:48.615690 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 09:03:48 crc kubenswrapper[4681]: E0122 09:03:48.617935 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 09:03:48 crc kubenswrapper[4681]: E0122 09:03:48.618056 4681 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:48 crc kubenswrapper[4681]: E0122 09:03:48.618200 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:49.118177457 +0000 UTC m=+19.944087972 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.617245 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.617835 4681 scope.go:117] "RemoveContainer" containerID="281d12e11d304b5374e32fa76a0f626fb5cc72dcb259f1e6f63f186558d14faf" Jan 22 09:03:48 crc kubenswrapper[4681]: E0122 09:03:48.617991 4681 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.621785 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.622103 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.623034 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.623053 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.624023 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.624311 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.625158 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.625442 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.625941 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.626125 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.626130 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.626428 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.626459 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.626472 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.627013 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.628342 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.627749 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.628049 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.628217 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.628464 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.628908 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.629101 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.629168 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.629368 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.629522 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.629601 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.629960 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.630156 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.630426 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.631080 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.631844 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.633242 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.633389 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.634502 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.635771 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.635993 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.637550 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.640576 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.640745 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.641193 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.641630 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.642624 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.642851 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.642903 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.643400 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.644633 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.645484 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.646216 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.646354 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.646608 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.647568 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.647722 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.648044 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.648479 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.648704 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.649727 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.654315 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.655057 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.655966 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.659911 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.664732 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.673106 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.678720 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.681833 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.681915 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.681992 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682018 4681 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682034 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682051 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682066 4681 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682080 4681 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 
09:03:48.682093 4681 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682106 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682102 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682138 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682121 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682302 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682320 4681 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682330 4681 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682344 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682377 4681 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682388 4681 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682398 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682409 4681 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682417 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682428 4681 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682459 4681 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682468 4681 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682480 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682489 4681 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682498 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682507 4681 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682536 4681 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682546 4681 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682556 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682564 4681 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682572 4681 reconciler_common.go:293] 
"Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682581 4681 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682608 4681 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682620 4681 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682629 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682638 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682647 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682659 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682668 4681 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682697 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682705 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682715 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682724 4681 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682733 4681 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682744 4681 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682772 4681 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682781 4681 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682790 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682799 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682807 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682816 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682824 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682853 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682863 4681 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682871 4681 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682879 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682887 4681 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682895 4681 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682903 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682933 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682943 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682952 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682962 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682972 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.682982 4681 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683012 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683022 4681 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683030 4681 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683041 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683050 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683058 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683088 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683100 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683108 4681 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683116 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683124 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683133 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683141 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683170 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683180 4681 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683188 4681 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683199 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683207 4681 reconciler_common.go:293] 
"Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683216 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683225 4681 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683254 4681 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683276 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683284 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683292 4681 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683301 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683332 4681 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683342 4681 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683350 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683360 4681 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683369 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683377 4681 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683385 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683413 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683423 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683432 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683441 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683451 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683459 4681 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683469 4681 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683499 4681 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683507 4681 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683516 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683525 4681 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683534 4681 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683543 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683574 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683585 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683594 4681 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683603 4681 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683612 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683621 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683649 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683660 4681 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683668 4681 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683680 4681 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683689 4681 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683697 4681 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683705 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683735 4681 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683745 4681 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683753 4681 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683763 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683772 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683781 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683808 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683819 4681 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683827 4681 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683836 4681 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683845 4681 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683856 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" 
DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683865 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683893 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683903 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683912 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683920 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683929 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683938 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683967 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683977 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683985 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.683996 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684006 4681 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684014 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684023 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684059 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684070 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684079 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684090 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684098 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684127 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684138 4681 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684151 4681 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684163 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684174 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684184 4681 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684212 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684221 4681 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684230 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684240 4681 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684249 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684293 4681 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684303 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684311 4681 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684319 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684327 4681 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684336 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684345 4681 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.684372 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.690571 4681 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.711515 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.712913 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ba76b-1f78-4242-97ac-3bd3faa13b93\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5391d29a26abd65f41dc6afe945e5f63e73dc09a6bc8d63adbce8f1c8ee151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db562e61f638140d3cf1ae6afccb6493b62767075989f5b1828b98878971a75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae238a6183c850f6bdfbf3a1885d47fe125dc23b63e32bd1d4aa487ed2805c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T
09:03:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8702dec0514120064dbfe60b959207918ee945ecdc1280aea4e8603ab9e1cd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://741166c92b268e9cf7b7663081907db064e921b622600bfd1a7135a8e7639ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bcafe40ad5f5849367e3d6baf2b079f1cebf9ab7e385bee82a5036e806c3867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bcafe40ad5f5849367e3d6baf2b079f1cebf9ab7e385bee82a5036e806c3867\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T09:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://797d061f34c7f8036aa53ae87140f2faec0a8c933bed1fe5e37750766808c68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://797d061f34c7f8036aa53ae87140f2faec0a8c933bed1
fe5e37750766808c68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T09:03:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e7e5c0067eb2838a4346240c7f0a3aa9876de8c335eb7baa78a7fd9180c4b25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e7e5c0067eb2838a4346240c7f0a3aa9876de8c335eb7baa78a7fd9180c4b25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T09:03:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T09:03:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T09:03:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.725002 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.726614 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 22 09:03:48 crc kubenswrapper[4681]: I0122 09:03:48.732255 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 22 09:03:48 crc kubenswrapper[4681]: W0122 09:03:48.755498 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-45e8425e20abf4dec69b7297ede864d997ef9e522a036811b35949ae03c66c18 WatchSource:0}: Error finding container 45e8425e20abf4dec69b7297ede864d997ef9e522a036811b35949ae03c66c18: Status 404 returned error can't find the container with id 45e8425e20abf4dec69b7297ede864d997ef9e522a036811b35949ae03c66c18 Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.187708 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.187852 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:03:49 crc kubenswrapper[4681]: E0122 09:03:49.188046 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:03:50.188009591 +0000 UTC m=+21.013920166 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.188113 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.188192 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.188236 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:03:49 crc kubenswrapper[4681]: E0122 09:03:49.188289 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 09:03:49 crc kubenswrapper[4681]: E0122 09:03:49.188325 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 09:03:49 crc kubenswrapper[4681]: E0122 09:03:49.188347 4681 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:49 crc kubenswrapper[4681]: E0122 09:03:49.188402 4681 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 09:03:49 crc kubenswrapper[4681]: E0122 09:03:49.188434 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:50.188402722 +0000 UTC m=+21.014313257 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:49 crc kubenswrapper[4681]: E0122 09:03:49.188464 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:50.188450743 +0000 UTC m=+21.014361278 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 09:03:49 crc kubenswrapper[4681]: E0122 09:03:49.188518 4681 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 09:03:49 crc kubenswrapper[4681]: E0122 09:03:49.188560 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:50.188549516 +0000 UTC m=+21.014460251 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 09:03:49 crc kubenswrapper[4681]: E0122 09:03:49.188680 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 09:03:49 crc kubenswrapper[4681]: E0122 09:03:49.188723 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 09:03:49 crc kubenswrapper[4681]: E0122 09:03:49.188737 4681 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:49 crc kubenswrapper[4681]: E0122 09:03:49.188773 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:50.188763461 +0000 UTC m=+21.014673966 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.400533 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 16:18:02.163933683 +0000 UTC Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.456468 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.457360 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.458036 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.458724 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.459380 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.459870 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.460528 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.461093 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.461722 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.462302 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.462939 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.463621 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.464103 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.464660 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.465214 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.465807 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.466400 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.466806 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.470297 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.470434 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.471018 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.471978 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.472710 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.473176 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.474195 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.474643 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.475764 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.476374 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.477283 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.477828 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.478607 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.479068 4681 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.479164 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.481183 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.481691 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.482076 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.483515 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.484518 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.485027 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.485971 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.486656 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.486685 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.487557 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.488142 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.489211 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.489841 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.490665 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.491169 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.492143 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.492897 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.493882 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.494377 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.495234 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.496092 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.496839 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.497707 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.502301 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db32b89-90cc-4e93-916a-257088ca3c23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2a67c9a932172beaa283e3822140faec7d1075d261cf5baa5dfade505706598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb0db720088c0c3720217f6756481511add404cafce819ef1e9cfeeada8b6525\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca2254db0d824fcbf184a198ab5262e492c608a943a2c13c708afbbe9f7809c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://281d12e11d304b5374e32fa76a0f626fb5cc72dcb259f1e6f63f186558d14faf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://281d12e11d304b5374e32fa76a0f626fb5cc72dcb259f1e6f63f186558d14faf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22
T09:03:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 09:03:41.839750 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 09:03:41.842244 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3934206829/tls.crt::/tmp/serving-cert-3934206829/tls.key\\\\\\\"\\\\nI0122 09:03:47.710714 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 09:03:47.717001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 09:03:47.717039 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 09:03:47.717092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 09:03:47.717103 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 09:03:47.747693 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0122 09:03:47.747753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 09:03:47.747766 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 09:03:47.747773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 09:03:47.747778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 09:03:47.747788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 09:03:47.747792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0122 09:03:47.748115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0122 09:03:47.755187 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T09:03:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c29c505da8ca30c818f5ba9818dea035254376f5739c0433c507a6dc7cf4698f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dee106e9b193951c3f506f0eb522812132719914b98cc7dbf60e1597311227ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee106e9b193951c3f506f0eb522812132719914b98cc7dbf60e1597311227ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T09:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T09:03:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.508769 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.513629 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.526725 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.530828 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ba76b-1f78-4242-97ac-3bd3faa13b93\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5391d29a26abd65f41dc6afe945e5f63e73dc09a6bc8d63adbce8f1c8ee151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db562e61f638140d3cf1ae6afccb6493b62767075989f5b1828b98878971a75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae238a6183c850f6bdfbf3a1885d47fe125dc23b63e32bd1d4aa487ed2805c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8702dec0514120064dbfe60b959207918ee945
ecdc1280aea4e8603ab9e1cd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://741166c92b268e9cf7b7663081907db064e921b622600bfd1a7135a8e7639ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bcafe40ad5f5849367e3d6baf2b079f1cebf9ab7e385bee82a5036e806c3867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bcafe40ad5f5849367e3d6baf2b079f1cebf9ab7e385bee82a5036e806c3867\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T09:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://797d061f34c7f8036aa53ae87140f2faec0a8c933bed1fe5e37750766808c68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://797d061f34c7f8036aa53ae87140f2faec0a8c933bed1fe5e37750766808c68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T09:03:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e7e5c0067eb2838a4346240c7f0a3aa9876de8c335eb7baa78a7fd9180c4b25\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e7e5c0067eb2838a4346240c7f0a3aa9876de8c335eb7baa78a7fd9180c4b25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T09:03:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T09:03:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T09:03:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.557827 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.578076 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.599765 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"45e8425e20abf4dec69b7297ede864d997ef9e522a036811b35949ae03c66c18"} Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.601900 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"434b56425878374774b32db2ab8009596bdf2ecc9d840a9434c8a1de87c43ad1"} Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.601930 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"27aba83a5f45a8e797a1fad8a114e205e82e1d11f1f1736548a9a592f5eac535"} Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.602053 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b26e0edaea97b86d3cfa2d86774e243bffa49ecc70c103dd35f07ab235f47d8c"} Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.603359 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"673dba252289f51255a3492765c5f3474d31bfde61c8792cbd20e1f095fab039"} Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.603387 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0a7a60fbdf5fbfe8ecef20bdcf9dbf4cb3ba157080cbf1a79d174192c1f1387e"} Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.605948 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.608306 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7e1ce53f8261ad730e2a43b8c4841fd5e72e3ef3bd4b11f9f32378e3ddb65c57"} Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.608947 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.610072 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 22 09:03:49 crc kubenswrapper[4681]: E0122 09:03:49.617802 4681 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.629923 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.666181 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ba76b-1f78-4242-97ac-3bd3faa13b93\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5391d29a26abd65f41dc6afe945e5f63e73dc09a6bc8d63adbce8f1c8ee151e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db562e61f638140d3cf1ae6afccb6493b62767075989f5b1828b98878971a75f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-m
etrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae238a6183c850f6bdfbf3a1885d47fe125dc23b63e32bd1d4aa487ed2805c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8702dec0514120064dbfe60b959207918ee945ecdc1280aea4e8603ab9e1cd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://741166c92b268e9cf7b7663081907db064e921b622600bfd1a7135a8e7639ec5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bcafe40ad5f5849367e3d6baf2b079f1cebf9ab7e385bee82a5036e806c3867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bcafe40ad5f5849367e3d6baf2b079f1cebf9ab7e385bee82a5036e806c3867\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T09:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://797d061f34c7f8036aa53ae87140f2faec0a8c933bed1fe5e37750766808c68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://797d061f34c7f8036aa53ae87140f2faec0a8c933bed1fe5e37750766808c68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T09:03:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e7e5c0067eb2838a4346240c7f0a3aa9876de8c335eb7baa78a7fd9180c4b25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e7e5c0067eb2838a4346240c7f0a3aa9876de8c335eb7baa78a7fd9180c4b25\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T09:03:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T09:03:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T09:03:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.683005 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.699746 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://673dba252289f51255a3492765c5f3474d31bfde61c8792cbd20e1f095fab039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.712463 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://434b56425878374774b32db2ab8009596bdf2ecc9d840a9434c8a1de87c43ad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27aba83a5f45a8e797a1fad8a114e205e82e1d11f1f1736548a9a592f5eac535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.727245 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.743676 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37c27dc5-c174-40fb-a9b4-e611e2492fce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aef33d5e1344934edcd521a1fff74f31b6555857b22cbff3bfee112ef646dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b42d0d981cb9e194bf8ad0a8b945e348cde3f9ed52985d54c501b1768d7f8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3ffd8edb629f15c010f0b2859c4f85a43e5758f5a190aeb3b680ef0fe1200f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c980d198d01212418816af80bff402b9851a466d7036e847a78038d06d92dd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T09:03:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.765902 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9db32b89-90cc-4e93-916a-257088ca3c23\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2a67c9a932172beaa283e3822140faec7d1075d261cf5baa5dfade505706598\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb0db720088c0c3720217f6756481511add404cafce819ef1e9cfeeada8b6525\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca2254db0d824fcbf184a198ab5262e492c608a943a2c13c708afbbe9f7809c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1ce53f8261ad730e2a43b8c4841fd5e72e3ef3bd4b11f9f32378e3ddb65c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://281d12e11d304b5374e32fa76a0f626fb5cc72dcb259f1e6f63f186558d14faf\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-22T09:03:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0122 09:03:41.839750 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0122 09:03:41.842244 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3934206829/tls.crt::/tmp/serving-cert-3934206829/tls.key\\\\\\\"\\\\nI0122 09:03:47.710714 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0122 09:03:47.717001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0122 09:03:47.717039 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0122 09:03:47.717092 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0122 09:03:47.717103 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0122 09:03:47.747693 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0122 09:03:47.747753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 09:03:47.747766 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0122 09:03:47.747773 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0122 09:03:47.747778 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0122 09:03:47.747788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0122 09:03:47.747792 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0122 09:03:47.748115 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0122 09:03:47.755187 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-22T09:03:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c29c505da8ca30c818f5ba9818dea035254376f5739c0433c507a6dc7cf4698f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-22T09:03:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dee106e9b193951c3f506f0eb522812132719914b98cc7dbf60e1597311227ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee106e9b193951c3f506f0eb522812132719914b98cc7dbf60e1597311227ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-22T09:03:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-22T09:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-22T09:03:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.780578 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 22 09:03:49 crc kubenswrapper[4681]: I0122 09:03:49.809293 4681 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-22T09:03:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-22T09:03:49Z is after 2025-08-24T17:21:41Z" Jan 22 09:03:50 crc kubenswrapper[4681]: I0122 09:03:50.201895 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:03:50 crc kubenswrapper[4681]: I0122 09:03:50.202097 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:03:50 crc kubenswrapper[4681]: E0122 09:03:50.202195 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:03:52.202141944 +0000 UTC m=+23.028052489 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:03:50 crc kubenswrapper[4681]: E0122 09:03:50.202331 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 09:03:50 crc kubenswrapper[4681]: E0122 09:03:50.202369 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 09:03:50 crc kubenswrapper[4681]: E0122 09:03:50.202397 4681 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:50 crc kubenswrapper[4681]: E0122 09:03:50.202497 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:52.202465183 +0000 UTC m=+23.028375888 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:50 crc kubenswrapper[4681]: E0122 09:03:50.202508 4681 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 09:03:50 crc kubenswrapper[4681]: I0122 09:03:50.202326 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:03:50 crc kubenswrapper[4681]: E0122 09:03:50.202587 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:52.202571446 +0000 UTC m=+23.028481981 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 09:03:50 crc kubenswrapper[4681]: I0122 09:03:50.202726 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:03:50 crc kubenswrapper[4681]: I0122 09:03:50.202813 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:03:50 crc kubenswrapper[4681]: E0122 09:03:50.202951 4681 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 09:03:50 crc kubenswrapper[4681]: E0122 09:03:50.203069 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:52.203027698 +0000 UTC m=+23.028938243 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 09:03:50 crc kubenswrapper[4681]: E0122 09:03:50.203188 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 09:03:50 crc kubenswrapper[4681]: E0122 09:03:50.203321 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 09:03:50 crc kubenswrapper[4681]: E0122 09:03:50.203359 4681 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:50 crc kubenswrapper[4681]: E0122 09:03:50.203501 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:52.203455479 +0000 UTC m=+23.029366194 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:50 crc kubenswrapper[4681]: I0122 09:03:50.400983 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 10:44:24.635991492 +0000 UTC Jan 22 09:03:50 crc kubenswrapper[4681]: I0122 09:03:50.451908 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:03:50 crc kubenswrapper[4681]: I0122 09:03:50.452051 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:03:50 crc kubenswrapper[4681]: I0122 09:03:50.452053 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:03:50 crc kubenswrapper[4681]: E0122 09:03:50.452143 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 09:03:50 crc kubenswrapper[4681]: E0122 09:03:50.452511 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 09:03:50 crc kubenswrapper[4681]: E0122 09:03:50.452550 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.401360 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 12:15:17.199248607 +0000 UTC Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.513919 4681 csr.go:261] certificate signing request csr-l8c7f is approved, waiting to be issued Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.524318 4681 csr.go:257] certificate signing request csr-l8c7f is issued Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.635986 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-bhp2c"] Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.636423 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bhp2c" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.638144 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.638569 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.638852 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.638861 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.660151 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=3.660125712 podStartE2EDuration="3.660125712s" podCreationTimestamp="2026-01-22 09:03:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:03:51.656541008 +0000 UTC m=+22.482451533" watchObservedRunningTime="2026-01-22 09:03:51.660125712 +0000 UTC m=+22.486036217" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.715531 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36558531-715d-4301-b818-4e812406f9f8-host\") pod \"node-ca-bhp2c\" (UID: \"36558531-715d-4301-b818-4e812406f9f8\") " pod="openshift-image-registry/node-ca-bhp2c" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.715568 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/36558531-715d-4301-b818-4e812406f9f8-serviceca\") pod \"node-ca-bhp2c\" (UID: \"36558531-715d-4301-b818-4e812406f9f8\") " pod="openshift-image-registry/node-ca-bhp2c" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.715614 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwrvs\" (UniqueName: \"kubernetes.io/projected/36558531-715d-4301-b818-4e812406f9f8-kube-api-access-zwrvs\") pod \"node-ca-bhp2c\" (UID: \"36558531-715d-4301-b818-4e812406f9f8\") " pod="openshift-image-registry/node-ca-bhp2c" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.741708 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-tbg7r"] Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.742041 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-tbg7r" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.746366 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.746960 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.747455 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.798394 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=3.7983697149999998 podStartE2EDuration="3.798369715s" podCreationTimestamp="2026-01-22 09:03:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:03:51.796183687 +0000 UTC m=+22.622094212" watchObservedRunningTime="2026-01-22 09:03:51.798369715 +0000 UTC m=+22.624280210" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.798575 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=2.79857154 podStartE2EDuration="2.79857154s" podCreationTimestamp="2026-01-22 09:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:03:51.741851212 +0000 UTC m=+22.567761717" watchObservedRunningTime="2026-01-22 09:03:51.79857154 +0000 UTC m=+22.624482045" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.816308 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v9sr\" (UniqueName: \"kubernetes.io/projected/3b499a30-4b1b-4ce8-9363-f23adf62ceb6-kube-api-access-9v9sr\") pod \"node-resolver-tbg7r\" (UID: \"3b499a30-4b1b-4ce8-9363-f23adf62ceb6\") " pod="openshift-dns/node-resolver-tbg7r" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.816362 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/36558531-715d-4301-b818-4e812406f9f8-serviceca\") pod \"node-ca-bhp2c\" (UID: \"36558531-715d-4301-b818-4e812406f9f8\") " pod="openshift-image-registry/node-ca-bhp2c" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.816389 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36558531-715d-4301-b818-4e812406f9f8-host\") pod \"node-ca-bhp2c\" (UID: \"36558531-715d-4301-b818-4e812406f9f8\") " pod="openshift-image-registry/node-ca-bhp2c" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.816443 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3b499a30-4b1b-4ce8-9363-f23adf62ceb6-hosts-file\") pod \"node-resolver-tbg7r\" (UID: \"3b499a30-4b1b-4ce8-9363-f23adf62ceb6\") " pod="openshift-dns/node-resolver-tbg7r" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.816469 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwrvs\" (UniqueName: \"kubernetes.io/projected/36558531-715d-4301-b818-4e812406f9f8-kube-api-access-zwrvs\") pod \"node-ca-bhp2c\" (UID: 
\"36558531-715d-4301-b818-4e812406f9f8\") " pod="openshift-image-registry/node-ca-bhp2c" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.816553 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36558531-715d-4301-b818-4e812406f9f8-host\") pod \"node-ca-bhp2c\" (UID: \"36558531-715d-4301-b818-4e812406f9f8\") " pod="openshift-image-registry/node-ca-bhp2c" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.818168 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/36558531-715d-4301-b818-4e812406f9f8-serviceca\") pod \"node-ca-bhp2c\" (UID: \"36558531-715d-4301-b818-4e812406f9f8\") " pod="openshift-image-registry/node-ca-bhp2c" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.819930 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-xpdjl"] Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.820419 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xpdjl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.822914 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.823055 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.823189 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.823278 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.823671 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.827300 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-zb7wn"] Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.827770 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.829704 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.829746 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.830131 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.834526 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.837583 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwrvs\" (UniqueName: \"kubernetes.io/projected/36558531-715d-4301-b818-4e812406f9f8-kube-api-access-zwrvs\") pod \"node-ca-bhp2c\" (UID: \"36558531-715d-4301-b818-4e812406f9f8\") " pod="openshift-image-registry/node-ca-bhp2c" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.845399 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.867402 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-knrhw"] Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.868010 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.871845 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.874844 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917102 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-run-netns\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917145 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-var-lib-cni-bin\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917169 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-var-lib-kubelet\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917186 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/d58a61a8-a6b2-4af6-92a6-c7bf6da6a432-rootfs\") pod \"machine-config-daemon-zb7wn\" (UID: \"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917215 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-system-cni-dir\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917235 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d58a61a8-a6b2-4af6-92a6-c7bf6da6a432-proxy-tls\") pod \"machine-config-daemon-zb7wn\" (UID: \"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917254 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-multus-socket-dir-parent\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917291 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-var-lib-cni-multus\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917312 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tzg7\" (UniqueName: \"kubernetes.io/projected/1976858f-1664-4b36-9929-65cc8fe9d0ad-kube-api-access-9tzg7\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917394 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c85970b8-70b4-44fc-a45d-8409cf53d709-os-release\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917431 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-cnibin\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917479 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c85970b8-70b4-44fc-a45d-8409cf53d709-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917498 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zg9g\" (UniqueName: \"kubernetes.io/projected/d58a61a8-a6b2-4af6-92a6-c7bf6da6a432-kube-api-access-9zg9g\") pod \"machine-config-daemon-zb7wn\" (UID: \"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917519 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3b499a30-4b1b-4ce8-9363-f23adf62ceb6-hosts-file\") pod \"node-resolver-tbg7r\" (UID: \"3b499a30-4b1b-4ce8-9363-f23adf62ceb6\") " pod="openshift-dns/node-resolver-tbg7r" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917536 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-os-release\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917556 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1976858f-1664-4b36-9929-65cc8fe9d0ad-multus-daemon-config\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917574 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-hostroot\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917602 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3b499a30-4b1b-4ce8-9363-f23adf62ceb6-hosts-file\") pod \"node-resolver-tbg7r\" (UID: \"3b499a30-4b1b-4ce8-9363-f23adf62ceb6\") " pod="openshift-dns/node-resolver-tbg7r" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917641 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-run-multus-certs\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917665 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v9sr\" (UniqueName: \"kubernetes.io/projected/3b499a30-4b1b-4ce8-9363-f23adf62ceb6-kube-api-access-9v9sr\") pod \"node-resolver-tbg7r\" (UID: \"3b499a30-4b1b-4ce8-9363-f23adf62ceb6\") " pod="openshift-dns/node-resolver-tbg7r" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917723 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjtg7\" (UniqueName: \"kubernetes.io/projected/c85970b8-70b4-44fc-a45d-8409cf53d709-kube-api-access-hjtg7\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917841 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-multus-cni-dir\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917872 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c85970b8-70b4-44fc-a45d-8409cf53d709-system-cni-dir\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917888 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c85970b8-70b4-44fc-a45d-8409cf53d709-cnibin\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917948 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c85970b8-70b4-44fc-a45d-8409cf53d709-tuning-conf-dir\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917968 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-run-k8s-cni-cncf-io\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.917985 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c85970b8-70b4-44fc-a45d-8409cf53d709-cni-binary-copy\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.918001 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1976858f-1664-4b36-9929-65cc8fe9d0ad-cni-binary-copy\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.918055 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-multus-conf-dir\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.918096 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-etc-kubernetes\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 
22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.918128 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d58a61a8-a6b2-4af6-92a6-c7bf6da6a432-mcd-auth-proxy-config\") pod \"machine-config-daemon-zb7wn\" (UID: \"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.935232 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v9sr\" (UniqueName: \"kubernetes.io/projected/3b499a30-4b1b-4ce8-9363-f23adf62ceb6-kube-api-access-9v9sr\") pod \"node-resolver-tbg7r\" (UID: \"3b499a30-4b1b-4ce8-9363-f23adf62ceb6\") " pod="openshift-dns/node-resolver-tbg7r" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.946588 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bhp2c" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.950055 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-28zgq"] Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.951024 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.953991 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.954343 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.954464 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.954590 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.954608 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.956415 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 22 09:03:51 crc kubenswrapper[4681]: I0122 09:03:51.956563 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.000406 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-vjf2g"] Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.000819 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.000881 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vjf2g" podUID="2e7e003a-24ec-4f48-a156-a5ed6a3afd03" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019377 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-system-cni-dir\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019422 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d58a61a8-a6b2-4af6-92a6-c7bf6da6a432-proxy-tls\") pod \"machine-config-daemon-zb7wn\" (UID: \"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019445 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-log-socket\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019461 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-cni-netd\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019477 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jfkg\" (UniqueName: \"kubernetes.io/projected/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-kube-api-access-8jfkg\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019495 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovnkube-config\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019512 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-multus-socket-dir-parent\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019527 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-var-lib-cni-multus\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019542 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tzg7\" (UniqueName: \"kubernetes.io/projected/1976858f-1664-4b36-9929-65cc8fe9d0ad-kube-api-access-9tzg7\") pod \"multus-xpdjl\" (UID: 
\"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019560 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-etc-openvswitch\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019592 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c85970b8-70b4-44fc-a45d-8409cf53d709-os-release\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019608 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-cnibin\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019624 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-run-ovn-kubernetes\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019643 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019670 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c85970b8-70b4-44fc-a45d-8409cf53d709-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019685 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zg9g\" (UniqueName: \"kubernetes.io/projected/d58a61a8-a6b2-4af6-92a6-c7bf6da6a432-kube-api-access-9zg9g\") pod \"machine-config-daemon-zb7wn\" (UID: \"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019703 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-os-release\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019719 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-run-netns\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019738 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1976858f-1664-4b36-9929-65cc8fe9d0ad-multus-daemon-config\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019755 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-hostroot\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019774 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-run-multus-certs\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019793 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-env-overrides\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019810 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-slash\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019829 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjtg7\" (UniqueName: \"kubernetes.io/projected/c85970b8-70b4-44fc-a45d-8409cf53d709-kube-api-access-hjtg7\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019846 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-multus-cni-dir\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019863 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-systemd-units\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019880 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/c85970b8-70b4-44fc-a45d-8409cf53d709-tuning-conf-dir\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019899 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-run-k8s-cni-cncf-io\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019919 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c85970b8-70b4-44fc-a45d-8409cf53d709-system-cni-dir\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019939 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c85970b8-70b4-44fc-a45d-8409cf53d709-cnibin\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019959 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-var-lib-openvswitch\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019977 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovn-node-metrics-cert\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.019997 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c85970b8-70b4-44fc-a45d-8409cf53d709-cni-binary-copy\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020013 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-kubelet\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020029 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1976858f-1664-4b36-9929-65cc8fe9d0ad-cni-binary-copy\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020045 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94bcj\" (UniqueName: \"kubernetes.io/projected/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-kube-api-access-94bcj\") pod \"network-metrics-daemon-vjf2g\" (UID: \"2e7e003a-24ec-4f48-a156-a5ed6a3afd03\") " pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020061 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-node-log\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020078 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-multus-conf-dir\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020096 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-etc-kubernetes\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020115 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d58a61a8-a6b2-4af6-92a6-c7bf6da6a432-mcd-auth-proxy-config\") pod \"machine-config-daemon-zb7wn\" (UID: \"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020130 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-openvswitch\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020148 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovnkube-script-lib\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020166 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-run-netns\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020182 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-systemd\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020198 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs\") pod \"network-metrics-daemon-vjf2g\" (UID: \"2e7e003a-24ec-4f48-a156-a5ed6a3afd03\") " pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020217 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-var-lib-cni-bin\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020235 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-var-lib-kubelet\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020269 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d58a61a8-a6b2-4af6-92a6-c7bf6da6a432-rootfs\") pod \"machine-config-daemon-zb7wn\" (UID: \"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020287 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-cni-bin\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020310 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-ovn\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020452 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-system-cni-dir\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.020995 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-multus-socket-dir-parent\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.021028 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-var-lib-cni-multus\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.021515 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-multus-conf-dir\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.021579 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-os-release\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.021642 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c85970b8-70b4-44fc-a45d-8409cf53d709-os-release\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.021695 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-cnibin\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.021873 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1976858f-1664-4b36-9929-65cc8fe9d0ad-multus-daemon-config\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.022162 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-multus-cni-dir\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.022218 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-run-k8s-cni-cncf-io\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.022245 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c85970b8-70b4-44fc-a45d-8409cf53d709-system-cni-dir\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.022283 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c85970b8-70b4-44fc-a45d-8409cf53d709-cnibin\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.022395 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-run-netns\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc 
kubenswrapper[4681]: I0122 09:03:52.022427 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d58a61a8-a6b2-4af6-92a6-c7bf6da6a432-mcd-auth-proxy-config\") pod \"machine-config-daemon-zb7wn\" (UID: \"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.022452 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c85970b8-70b4-44fc-a45d-8409cf53d709-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.022505 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d58a61a8-a6b2-4af6-92a6-c7bf6da6a432-rootfs\") pod \"machine-config-daemon-zb7wn\" (UID: \"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.022510 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-var-lib-kubelet\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.022547 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-var-lib-cni-bin\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.022559 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-hostroot\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.022559 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-host-run-multus-certs\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.022581 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1976858f-1664-4b36-9929-65cc8fe9d0ad-etc-kubernetes\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.023061 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1976858f-1664-4b36-9929-65cc8fe9d0ad-cni-binary-copy\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.023146 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/c85970b8-70b4-44fc-a45d-8409cf53d709-cni-binary-copy\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.023193 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c85970b8-70b4-44fc-a45d-8409cf53d709-tuning-conf-dir\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.023674 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d58a61a8-a6b2-4af6-92a6-c7bf6da6a432-proxy-tls\") pod \"machine-config-daemon-zb7wn\" (UID: \"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.039729 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjtg7\" (UniqueName: \"kubernetes.io/projected/c85970b8-70b4-44fc-a45d-8409cf53d709-kube-api-access-hjtg7\") pod \"multus-additional-cni-plugins-knrhw\" (UID: \"c85970b8-70b4-44fc-a45d-8409cf53d709\") " pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.042746 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tzg7\" (UniqueName: \"kubernetes.io/projected/1976858f-1664-4b36-9929-65cc8fe9d0ad-kube-api-access-9tzg7\") pod \"multus-xpdjl\" (UID: \"1976858f-1664-4b36-9929-65cc8fe9d0ad\") " pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.042900 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zg9g\" (UniqueName: \"kubernetes.io/projected/d58a61a8-a6b2-4af6-92a6-c7bf6da6a432-kube-api-access-9zg9g\") pod \"machine-config-daemon-zb7wn\" (UID: \"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432\") " pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.052537 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-tbg7r" Jan 22 09:03:52 crc kubenswrapper[4681]: W0122 09:03:52.063436 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b499a30_4b1b_4ce8_9363_f23adf62ceb6.slice/crio-2a5036e70ae2f8f957ec97c982c87b7eef55e9e1a7cbc0619942d6756047fc6d WatchSource:0}: Error finding container 2a5036e70ae2f8f957ec97c982c87b7eef55e9e1a7cbc0619942d6756047fc6d: Status 404 returned error can't find the container with id 2a5036e70ae2f8f957ec97c982c87b7eef55e9e1a7cbc0619942d6756047fc6d Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.120977 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-env-overrides\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121017 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-systemd-units\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121036 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-slash\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121058 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-var-lib-openvswitch\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121077 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovn-node-metrics-cert\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121099 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-kubelet\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121119 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94bcj\" (UniqueName: \"kubernetes.io/projected/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-kube-api-access-94bcj\") pod \"network-metrics-daemon-vjf2g\" (UID: \"2e7e003a-24ec-4f48-a156-a5ed6a3afd03\") " pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121139 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-node-log\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121160 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-systemd\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121177 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-openvswitch\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121195 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovnkube-script-lib\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121215 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs\") pod \"network-metrics-daemon-vjf2g\" (UID: \"2e7e003a-24ec-4f48-a156-a5ed6a3afd03\") " pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121234 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-cni-bin\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121278 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-ovn\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121296 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jfkg\" (UniqueName: \"kubernetes.io/projected/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-kube-api-access-8jfkg\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121314 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-log-socket\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121330 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-cni-netd\") pod \"ovnkube-node-28zgq\" 
(UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121347 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-etc-openvswitch\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121363 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovnkube-config\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121398 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121419 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-run-ovn-kubernetes\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121443 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-run-netns\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.121509 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-run-netns\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.122128 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-env-overrides\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.122427 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-log-socket\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.122503 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-ovn\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.122498 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-cni-bin\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.122523 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-kubelet\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.122555 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-run-ovn-kubernetes\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.122581 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-node-log\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.122503 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.122622 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-systemd-units\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.122656 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-var-lib-openvswitch\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.122662 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-cni-netd\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.122667 4681 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.122704 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-openvswitch\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.122713 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-systemd\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.122681 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovnkube-script-lib\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.122714 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-etc-openvswitch\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.122684 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-slash\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.122810 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs podName:2e7e003a-24ec-4f48-a156-a5ed6a3afd03 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:52.622769395 +0000 UTC m=+23.448679920 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs") pod "network-metrics-daemon-vjf2g" (UID: "2e7e003a-24ec-4f48-a156-a5ed6a3afd03") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.123136 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovnkube-config\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.126678 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovn-node-metrics-cert\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.131292 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xpdjl" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.146459 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.147627 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jfkg\" (UniqueName: \"kubernetes.io/projected/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-kube-api-access-8jfkg\") pod \"ovnkube-node-28zgq\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.148798 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94bcj\" (UniqueName: \"kubernetes.io/projected/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-kube-api-access-94bcj\") pod \"network-metrics-daemon-vjf2g\" (UID: \"2e7e003a-24ec-4f48-a156-a5ed6a3afd03\") " pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:03:52 crc kubenswrapper[4681]: W0122 09:03:52.161859 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd58a61a8_a6b2_4af6_92a6_c7bf6da6a432.slice/crio-bd268d52af1d6b774fea0e9edd0cae4d01071908e1988c213231d2030c7814f0 WatchSource:0}: Error finding container bd268d52af1d6b774fea0e9edd0cae4d01071908e1988c213231d2030c7814f0: Status 404 returned error can't find the container with id bd268d52af1d6b774fea0e9edd0cae4d01071908e1988c213231d2030c7814f0 Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.178408 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-knrhw" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.223574 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.223645 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.223672 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.223691 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.223842 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.223927 4681 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.223979 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:56.223964679 +0000 UTC m=+27.049875184 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.223978 4681 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.224012 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:56.22400626 +0000 UTC m=+27.049916765 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.224049 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:03:56.224031981 +0000 UTC m=+27.049942486 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.224080 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.224096 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.224106 4681 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.224127 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.224142 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.224152 4681 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.224127 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:56.224120953 +0000 UTC m=+27.050031458 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.224187 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:56.224180455 +0000 UTC m=+27.050090960 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.246295 4681 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.251478 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.251508 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.251517 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.251815 4681 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.280172 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.283837 4681 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.284164 4681 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.293508 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.293546 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.293555 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.293585 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.293594 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T09:03:52Z","lastTransitionTime":"2026-01-22T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.361689 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.361725 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.361738 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.361758 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.361770 4681 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-22T09:03:52Z","lastTransitionTime":"2026-01-22T09:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.401905 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 00:12:29.662027371 +0000 UTC Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.401971 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.420764 4681 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.452252 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.452404 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.452803 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.452877 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.452950 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.453015 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.489253 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28"] Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.490048 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.494060 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.494223 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.525322 4681 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-22 08:58:51 +0000 UTC, rotation deadline is 2026-11-18 06:52:42.388790991 +0000 UTC Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.525395 4681 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7197h48m49.863398699s for next certificate rotation Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.526978 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7992548b-3d9c-4776-8cf2-c1c496fb8d67-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-glj28\" (UID: \"7992548b-3d9c-4776-8cf2-c1c496fb8d67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.527050 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7992548b-3d9c-4776-8cf2-c1c496fb8d67-env-overrides\") pod \"ovnkube-control-plane-749d76644c-glj28\" (UID: \"7992548b-3d9c-4776-8cf2-c1c496fb8d67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.527233 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7992548b-3d9c-4776-8cf2-c1c496fb8d67-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-glj28\" (UID: \"7992548b-3d9c-4776-8cf2-c1c496fb8d67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.527394 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nf7s\" (UniqueName: \"kubernetes.io/projected/7992548b-3d9c-4776-8cf2-c1c496fb8d67-kube-api-access-5nf7s\") pod \"ovnkube-control-plane-749d76644c-glj28\" (UID: 
\"7992548b-3d9c-4776-8cf2-c1c496fb8d67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.618811 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xpdjl" event={"ID":"1976858f-1664-4b36-9929-65cc8fe9d0ad","Type":"ContainerStarted","Data":"c5d3f9a31740c41885595ea6d65e085332d9c898f6c5f2080131d3840cfc5c51"} Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.618878 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xpdjl" event={"ID":"1976858f-1664-4b36-9929-65cc8fe9d0ad","Type":"ContainerStarted","Data":"48e5742e8bd5ac19f941b122ff6d6a25a3a16e6e578d4c1038f63ebdb1d0d71c"} Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.620518 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerStarted","Data":"47bb918c6db5dee95990c50fd2f792307b27072485cb79344acdc7b85e14768a"} Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.620577 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerStarted","Data":"dd545a538ee0d4885a7c0851ab88c778c7ae321e7dec7c6afb193d36bbeb4815"} Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.620594 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerStarted","Data":"bd268d52af1d6b774fea0e9edd0cae4d01071908e1988c213231d2030c7814f0"} Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.621539 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tbg7r" event={"ID":"3b499a30-4b1b-4ce8-9363-f23adf62ceb6","Type":"ContainerStarted","Data":"e1875d77d25cee2d487841c7f3b92b6200b5f4d553daf1cdf928e24decf9b9a0"} Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.621572 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tbg7r" event={"ID":"3b499a30-4b1b-4ce8-9363-f23adf62ceb6","Type":"ContainerStarted","Data":"2a5036e70ae2f8f957ec97c982c87b7eef55e9e1a7cbc0619942d6756047fc6d"} Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.623290 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bhp2c" event={"ID":"36558531-715d-4301-b818-4e812406f9f8","Type":"ContainerStarted","Data":"cb47fcdd611df99435c33be921ca463472860aa3df97610c9114accb1b51ff78"} Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.623330 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bhp2c" event={"ID":"36558531-715d-4301-b818-4e812406f9f8","Type":"ContainerStarted","Data":"f7683019d855f1f62ffd9f2567c1b59dd5215c0511d79a47bee977bc103e864b"} Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.624698 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"bc2ba42ea663e3502772d7fee3004a6da7d9a2ea08bb27da247155d8535d374a"} Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.626783 4681 generic.go:334] "Generic (PLEG): container finished" podID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" 
containerID="4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420" exitCode=0 Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.626845 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerDied","Data":"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420"} Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.626867 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerStarted","Data":"afa77250c4f77835910ead4aa39d6d70c11f13582f6bfffbbd60b36b29e79354"} Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.628024 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7992548b-3d9c-4776-8cf2-c1c496fb8d67-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-glj28\" (UID: \"7992548b-3d9c-4776-8cf2-c1c496fb8d67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.628395 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knrhw" event={"ID":"c85970b8-70b4-44fc-a45d-8409cf53d709","Type":"ContainerStarted","Data":"2c11e647fca30d8957eb8a855a4b4957a6615e26df00cc3e02c388dda63ea3ef"} Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.628606 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knrhw" event={"ID":"c85970b8-70b4-44fc-a45d-8409cf53d709","Type":"ContainerStarted","Data":"97b7e53e9bd747a86505a297bbf314653a9b6132848baba48200ef3d58baf5de"} Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.628549 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nf7s\" (UniqueName: \"kubernetes.io/projected/7992548b-3d9c-4776-8cf2-c1c496fb8d67-kube-api-access-5nf7s\") pod \"ovnkube-control-plane-749d76644c-glj28\" (UID: \"7992548b-3d9c-4776-8cf2-c1c496fb8d67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.628803 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs\") pod \"network-metrics-daemon-vjf2g\" (UID: \"2e7e003a-24ec-4f48-a156-a5ed6a3afd03\") " pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.628853 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7992548b-3d9c-4776-8cf2-c1c496fb8d67-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-glj28\" (UID: \"7992548b-3d9c-4776-8cf2-c1c496fb8d67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.628932 4681 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 09:03:52 crc kubenswrapper[4681]: E0122 09:03:52.628976 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs podName:2e7e003a-24ec-4f48-a156-a5ed6a3afd03 nodeName:}" 
failed. No retries permitted until 2026-01-22 09:03:53.628956868 +0000 UTC m=+24.454867373 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs") pod "network-metrics-daemon-vjf2g" (UID: "2e7e003a-24ec-4f48-a156-a5ed6a3afd03") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.629002 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7992548b-3d9c-4776-8cf2-c1c496fb8d67-env-overrides\") pod \"ovnkube-control-plane-749d76644c-glj28\" (UID: \"7992548b-3d9c-4776-8cf2-c1c496fb8d67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.629627 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7992548b-3d9c-4776-8cf2-c1c496fb8d67-env-overrides\") pod \"ovnkube-control-plane-749d76644c-glj28\" (UID: \"7992548b-3d9c-4776-8cf2-c1c496fb8d67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.629735 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7992548b-3d9c-4776-8cf2-c1c496fb8d67-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-glj28\" (UID: \"7992548b-3d9c-4776-8cf2-c1c496fb8d67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.633114 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7992548b-3d9c-4776-8cf2-c1c496fb8d67-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-glj28\" (UID: \"7992548b-3d9c-4776-8cf2-c1c496fb8d67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.651431 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nf7s\" (UniqueName: \"kubernetes.io/projected/7992548b-3d9c-4776-8cf2-c1c496fb8d67-kube-api-access-5nf7s\") pod \"ovnkube-control-plane-749d76644c-glj28\" (UID: \"7992548b-3d9c-4776-8cf2-c1c496fb8d67\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.655181 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xpdjl" podStartSLOduration=1.655158001 podStartE2EDuration="1.655158001s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:03:52.643055571 +0000 UTC m=+23.468966076" watchObservedRunningTime="2026-01-22 09:03:52.655158001 +0000 UTC m=+23.481068506" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.655601 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bhp2c" podStartSLOduration=1.655595862 podStartE2EDuration="1.655595862s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:03:52.65476234 
+0000 UTC m=+23.480672845" watchObservedRunningTime="2026-01-22 09:03:52.655595862 +0000 UTC m=+23.481506367" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.667329 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tbg7r" podStartSLOduration=1.667299521 podStartE2EDuration="1.667299521s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:03:52.666952102 +0000 UTC m=+23.492862627" watchObservedRunningTime="2026-01-22 09:03:52.667299521 +0000 UTC m=+23.493210026" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.729930 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podStartSLOduration=1.729905786 podStartE2EDuration="1.729905786s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:03:52.705679805 +0000 UTC m=+23.531590310" watchObservedRunningTime="2026-01-22 09:03:52.729905786 +0000 UTC m=+23.555816291" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.854061 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" Jan 22 09:03:52 crc kubenswrapper[4681]: W0122 09:03:52.871853 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7992548b_3d9c_4776_8cf2_c1c496fb8d67.slice/crio-f0f652935d82c81e55c5623129c578dc5e28574d55ad62d2982c26eda3468a7e WatchSource:0}: Error finding container f0f652935d82c81e55c5623129c578dc5e28574d55ad62d2982c26eda3468a7e: Status 404 returned error can't find the container with id f0f652935d82c81e55c5623129c578dc5e28574d55ad62d2982c26eda3468a7e Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.873245 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp"] Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.874598 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.880513 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.881042 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.881206 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.884354 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.933032 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kn8bp\" (UID: \"cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.933077 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kn8bp\" (UID: \"cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.933128 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kn8bp\" (UID: \"cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.933156 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kn8bp\" (UID: \"cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" Jan 22 09:03:52 crc kubenswrapper[4681]: I0122 09:03:52.933196 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kn8bp\" (UID: \"cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.034223 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kn8bp\" (UID: \"cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" Jan 22 09:03:53 crc 
kubenswrapper[4681]: I0122 09:03:53.034302 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kn8bp\" (UID: \"cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.034356 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kn8bp\" (UID: \"cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.034375 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kn8bp\" (UID: \"cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.034413 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kn8bp\" (UID: \"cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.034486 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kn8bp\" (UID: \"cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.034493 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kn8bp\" (UID: \"cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.035340 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kn8bp\" (UID: \"cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.039929 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kn8bp\" (UID: \"cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.050073 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kn8bp\" (UID: \"cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.217411 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" Jan 22 09:03:53 crc kubenswrapper[4681]: W0122 09:03:53.229977 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd0f89af_a84b_4368_9f18_1e2e3fdb9e0a.slice/crio-72b8ca33183d3b2c591d5f231ecc6220097e1f8e14fa040b61c478fd18f3bd98 WatchSource:0}: Error finding container 72b8ca33183d3b2c591d5f231ecc6220097e1f8e14fa040b61c478fd18f3bd98: Status 404 returned error can't find the container with id 72b8ca33183d3b2c591d5f231ecc6220097e1f8e14fa040b61c478fd18f3bd98 Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.452255 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:03:53 crc kubenswrapper[4681]: E0122 09:03:53.452412 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjf2g" podUID="2e7e003a-24ec-4f48-a156-a5ed6a3afd03" Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.632865 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" event={"ID":"cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a","Type":"ContainerStarted","Data":"72b8ca33183d3b2c591d5f231ecc6220097e1f8e14fa040b61c478fd18f3bd98"} Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.636323 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerStarted","Data":"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e"} Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.636360 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerStarted","Data":"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f"} Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.638120 4681 generic.go:334] "Generic (PLEG): container finished" podID="c85970b8-70b4-44fc-a45d-8409cf53d709" containerID="2c11e647fca30d8957eb8a855a4b4957a6615e26df00cc3e02c388dda63ea3ef" exitCode=0 Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.638169 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knrhw" event={"ID":"c85970b8-70b4-44fc-a45d-8409cf53d709","Type":"ContainerDied","Data":"2c11e647fca30d8957eb8a855a4b4957a6615e26df00cc3e02c388dda63ea3ef"} Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.639550 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" 
event={"ID":"7992548b-3d9c-4776-8cf2-c1c496fb8d67","Type":"ContainerStarted","Data":"f0f652935d82c81e55c5623129c578dc5e28574d55ad62d2982c26eda3468a7e"} Jan 22 09:03:53 crc kubenswrapper[4681]: I0122 09:03:53.640072 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs\") pod \"network-metrics-daemon-vjf2g\" (UID: \"2e7e003a-24ec-4f48-a156-a5ed6a3afd03\") " pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:03:53 crc kubenswrapper[4681]: E0122 09:03:53.640220 4681 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 09:03:53 crc kubenswrapper[4681]: E0122 09:03:53.640275 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs podName:2e7e003a-24ec-4f48-a156-a5ed6a3afd03 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:55.640249216 +0000 UTC m=+26.466159721 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs") pod "network-metrics-daemon-vjf2g" (UID: "2e7e003a-24ec-4f48-a156-a5ed6a3afd03") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 09:03:54 crc kubenswrapper[4681]: I0122 09:03:54.451518 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:03:54 crc kubenswrapper[4681]: E0122 09:03:54.452006 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 09:03:54 crc kubenswrapper[4681]: I0122 09:03:54.451708 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:03:54 crc kubenswrapper[4681]: I0122 09:03:54.451571 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:03:54 crc kubenswrapper[4681]: E0122 09:03:54.452103 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 09:03:54 crc kubenswrapper[4681]: E0122 09:03:54.452232 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 09:03:54 crc kubenswrapper[4681]: I0122 09:03:54.651913 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerStarted","Data":"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634"} Jan 22 09:03:54 crc kubenswrapper[4681]: I0122 09:03:54.651973 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerStarted","Data":"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786"} Jan 22 09:03:54 crc kubenswrapper[4681]: I0122 09:03:54.651982 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerStarted","Data":"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc"} Jan 22 09:03:54 crc kubenswrapper[4681]: I0122 09:03:54.651991 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerStarted","Data":"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a"} Jan 22 09:03:54 crc kubenswrapper[4681]: I0122 09:03:54.654474 4681 generic.go:334] "Generic (PLEG): container finished" podID="c85970b8-70b4-44fc-a45d-8409cf53d709" containerID="9120f7ceead0eba6b401f6e366538ed51be3b83f147c1217b51d5d4a6b80d7da" exitCode=0 Jan 22 09:03:54 crc kubenswrapper[4681]: I0122 09:03:54.654526 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knrhw" event={"ID":"c85970b8-70b4-44fc-a45d-8409cf53d709","Type":"ContainerDied","Data":"9120f7ceead0eba6b401f6e366538ed51be3b83f147c1217b51d5d4a6b80d7da"} Jan 22 09:03:54 crc kubenswrapper[4681]: I0122 09:03:54.656858 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" event={"ID":"cd0f89af-a84b-4368-9f18-1e2e3fdb9e0a","Type":"ContainerStarted","Data":"331ebcbff41ec0143245374f02de9e21ef6c62ac1df9955a321d8c092291af47"} Jan 22 09:03:54 crc kubenswrapper[4681]: I0122 09:03:54.660007 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" event={"ID":"7992548b-3d9c-4776-8cf2-c1c496fb8d67","Type":"ContainerStarted","Data":"156c2c2ec12c93b8a4f0037a74cff30777fca1c62d6aca5e73956d3ca7c014f4"} Jan 22 09:03:54 crc kubenswrapper[4681]: I0122 09:03:54.660090 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" event={"ID":"7992548b-3d9c-4776-8cf2-c1c496fb8d67","Type":"ContainerStarted","Data":"c7960a6819b06a02e37ee3d9387b7ca5f7b4b9fbc830d495f7f38ef5a36ffd7f"} Jan 22 09:03:54 crc kubenswrapper[4681]: I0122 09:03:54.714185 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kn8bp" podStartSLOduration=3.714157939 podStartE2EDuration="3.714157939s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:03:54.714097677 +0000 UTC m=+25.540008222" watchObservedRunningTime="2026-01-22 
09:03:54.714157939 +0000 UTC m=+25.540068464" Jan 22 09:03:54 crc kubenswrapper[4681]: I0122 09:03:54.735541 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-glj28" podStartSLOduration=2.735511353 podStartE2EDuration="2.735511353s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:03:54.733913801 +0000 UTC m=+25.559824356" watchObservedRunningTime="2026-01-22 09:03:54.735511353 +0000 UTC m=+25.561421948" Jan 22 09:03:55 crc kubenswrapper[4681]: I0122 09:03:55.452248 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:03:55 crc kubenswrapper[4681]: E0122 09:03:55.452760 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjf2g" podUID="2e7e003a-24ec-4f48-a156-a5ed6a3afd03" Jan 22 09:03:55 crc kubenswrapper[4681]: I0122 09:03:55.668373 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs\") pod \"network-metrics-daemon-vjf2g\" (UID: \"2e7e003a-24ec-4f48-a156-a5ed6a3afd03\") " pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:03:55 crc kubenswrapper[4681]: E0122 09:03:55.668523 4681 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 09:03:55 crc kubenswrapper[4681]: E0122 09:03:55.668567 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs podName:2e7e003a-24ec-4f48-a156-a5ed6a3afd03 nodeName:}" failed. No retries permitted until 2026-01-22 09:03:59.668551862 +0000 UTC m=+30.494462367 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs") pod "network-metrics-daemon-vjf2g" (UID: "2e7e003a-24ec-4f48-a156-a5ed6a3afd03") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 09:03:55 crc kubenswrapper[4681]: I0122 09:03:55.668870 4681 generic.go:334] "Generic (PLEG): container finished" podID="c85970b8-70b4-44fc-a45d-8409cf53d709" containerID="1541da3454df75f18ea4c06e5e9be25e9c0a1328bc0383b01f0f338303890591" exitCode=0 Jan 22 09:03:55 crc kubenswrapper[4681]: I0122 09:03:55.668918 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knrhw" event={"ID":"c85970b8-70b4-44fc-a45d-8409cf53d709","Type":"ContainerDied","Data":"1541da3454df75f18ea4c06e5e9be25e9c0a1328bc0383b01f0f338303890591"} Jan 22 09:03:56 crc kubenswrapper[4681]: I0122 09:03:56.274286 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:03:56 crc kubenswrapper[4681]: E0122 09:03:56.274520 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:04.274488191 +0000 UTC m=+35.100398696 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:03:56 crc kubenswrapper[4681]: I0122 09:03:56.274625 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:03:56 crc kubenswrapper[4681]: I0122 09:03:56.274672 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:03:56 crc kubenswrapper[4681]: E0122 09:03:56.274804 4681 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 09:03:56 crc kubenswrapper[4681]: E0122 09:03:56.274871 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 09:03:56 crc kubenswrapper[4681]: E0122 09:03:56.274887 4681 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 09:04:04.274864711 +0000 UTC m=+35.100775226 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 09:03:56 crc kubenswrapper[4681]: E0122 09:03:56.274894 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 09:03:56 crc kubenswrapper[4681]: E0122 09:03:56.274914 4681 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:56 crc kubenswrapper[4681]: E0122 09:03:56.274956 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 09:04:04.274946853 +0000 UTC m=+35.100857358 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:56 crc kubenswrapper[4681]: I0122 09:03:56.274707 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:03:56 crc kubenswrapper[4681]: I0122 09:03:56.275024 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:03:56 crc kubenswrapper[4681]: E0122 09:03:56.275095 4681 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 09:03:56 crc kubenswrapper[4681]: E0122 09:03:56.275130 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 09:03:56 crc kubenswrapper[4681]: E0122 09:03:56.275146 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 09:03:56 crc kubenswrapper[4681]: E0122 09:03:56.275153 4681 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:56 crc kubenswrapper[4681]: E0122 09:03:56.275186 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 09:04:04.275179819 +0000 UTC m=+35.101090324 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:03:56 crc kubenswrapper[4681]: E0122 09:03:56.275227 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 09:04:04.27519445 +0000 UTC m=+35.101105165 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 09:03:56 crc kubenswrapper[4681]: I0122 09:03:56.451873 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:03:56 crc kubenswrapper[4681]: I0122 09:03:56.451995 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:03:56 crc kubenswrapper[4681]: I0122 09:03:56.452094 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:03:56 crc kubenswrapper[4681]: E0122 09:03:56.452021 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 09:03:56 crc kubenswrapper[4681]: E0122 09:03:56.452226 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 09:03:56 crc kubenswrapper[4681]: E0122 09:03:56.452431 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 09:03:56 crc kubenswrapper[4681]: I0122 09:03:56.680502 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerStarted","Data":"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474"} Jan 22 09:03:56 crc kubenswrapper[4681]: I0122 09:03:56.684000 4681 generic.go:334] "Generic (PLEG): container finished" podID="c85970b8-70b4-44fc-a45d-8409cf53d709" containerID="782272c6f0b6dd361d405fea4142b03d1965b551de71bd14e3736656617063cd" exitCode=0 Jan 22 09:03:56 crc kubenswrapper[4681]: I0122 09:03:56.684048 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knrhw" event={"ID":"c85970b8-70b4-44fc-a45d-8409cf53d709","Type":"ContainerDied","Data":"782272c6f0b6dd361d405fea4142b03d1965b551de71bd14e3736656617063cd"} Jan 22 09:03:57 crc kubenswrapper[4681]: I0122 09:03:57.451629 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:03:57 crc kubenswrapper[4681]: E0122 09:03:57.451771 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjf2g" podUID="2e7e003a-24ec-4f48-a156-a5ed6a3afd03" Jan 22 09:03:57 crc kubenswrapper[4681]: I0122 09:03:57.691927 4681 generic.go:334] "Generic (PLEG): container finished" podID="c85970b8-70b4-44fc-a45d-8409cf53d709" containerID="bdf116040c22a6458fb8cd29e1928d19d8ed828eb3d880eca88eeab3ec702905" exitCode=0 Jan 22 09:03:57 crc kubenswrapper[4681]: I0122 09:03:57.692000 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knrhw" event={"ID":"c85970b8-70b4-44fc-a45d-8409cf53d709","Type":"ContainerDied","Data":"bdf116040c22a6458fb8cd29e1928d19d8ed828eb3d880eca88eeab3ec702905"} Jan 22 09:03:58 crc kubenswrapper[4681]: I0122 09:03:58.451802 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:03:58 crc kubenswrapper[4681]: I0122 09:03:58.452155 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:03:58 crc kubenswrapper[4681]: I0122 09:03:58.452213 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:03:58 crc kubenswrapper[4681]: E0122 09:03:58.452532 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 09:03:58 crc kubenswrapper[4681]: E0122 09:03:58.453289 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 09:03:58 crc kubenswrapper[4681]: E0122 09:03:58.453416 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 09:03:58 crc kubenswrapper[4681]: I0122 09:03:58.711829 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerStarted","Data":"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c"} Jan 22 09:03:58 crc kubenswrapper[4681]: I0122 09:03:58.712521 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:58 crc kubenswrapper[4681]: I0122 09:03:58.712628 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:58 crc kubenswrapper[4681]: I0122 09:03:58.724789 4681 generic.go:334] "Generic (PLEG): container finished" podID="c85970b8-70b4-44fc-a45d-8409cf53d709" containerID="dc9b143e6805d327947778e73153989e2b0ca1ff0aa2a38d352d56233cf44e6a" exitCode=0 Jan 22 09:03:58 crc kubenswrapper[4681]: I0122 09:03:58.724854 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knrhw" event={"ID":"c85970b8-70b4-44fc-a45d-8409cf53d709","Type":"ContainerDied","Data":"dc9b143e6805d327947778e73153989e2b0ca1ff0aa2a38d352d56233cf44e6a"} Jan 22 09:03:58 crc kubenswrapper[4681]: I0122 09:03:58.750016 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:58 crc kubenswrapper[4681]: I0122 09:03:58.756483 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:03:58 crc kubenswrapper[4681]: I0122 09:03:58.791674 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" podStartSLOduration=7.791647262 podStartE2EDuration="7.791647262s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:03:58.764051873 +0000 UTC m=+29.589962378" watchObservedRunningTime="2026-01-22 09:03:58.791647262 +0000 UTC m=+29.617557767" Jan 22 09:03:59 crc kubenswrapper[4681]: I0122 09:03:59.318491 4681 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 22 09:03:59 crc kubenswrapper[4681]: I0122 09:03:59.452125 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:03:59 crc kubenswrapper[4681]: E0122 09:03:59.453512 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjf2g" podUID="2e7e003a-24ec-4f48-a156-a5ed6a3afd03" Jan 22 09:03:59 crc kubenswrapper[4681]: I0122 09:03:59.717528 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs\") pod \"network-metrics-daemon-vjf2g\" (UID: \"2e7e003a-24ec-4f48-a156-a5ed6a3afd03\") " pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:03:59 crc kubenswrapper[4681]: E0122 09:03:59.717830 4681 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 09:03:59 crc kubenswrapper[4681]: E0122 09:03:59.717960 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs podName:2e7e003a-24ec-4f48-a156-a5ed6a3afd03 nodeName:}" failed. No retries permitted until 2026-01-22 09:04:07.717929444 +0000 UTC m=+38.543839979 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs") pod "network-metrics-daemon-vjf2g" (UID: "2e7e003a-24ec-4f48-a156-a5ed6a3afd03") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 22 09:03:59 crc kubenswrapper[4681]: I0122 09:03:59.736291 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-knrhw" event={"ID":"c85970b8-70b4-44fc-a45d-8409cf53d709","Type":"ContainerStarted","Data":"71a12d49f36126cd4c1ced75cf7e343247b4bdbb8cbde14a8dda6056417d13bc"} Jan 22 09:03:59 crc kubenswrapper[4681]: I0122 09:03:59.736587 4681 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 22 09:04:00 crc kubenswrapper[4681]: I0122 09:04:00.451730 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:04:00 crc kubenswrapper[4681]: I0122 09:04:00.451839 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:04:00 crc kubenswrapper[4681]: I0122 09:04:00.451867 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:04:00 crc kubenswrapper[4681]: E0122 09:04:00.451962 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 09:04:00 crc kubenswrapper[4681]: E0122 09:04:00.452072 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 09:04:00 crc kubenswrapper[4681]: E0122 09:04:00.452204 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 09:04:00 crc kubenswrapper[4681]: I0122 09:04:00.741241 4681 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 22 09:04:00 crc kubenswrapper[4681]: I0122 09:04:00.798774 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-knrhw" podStartSLOduration=9.798741429 podStartE2EDuration="9.798741429s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:03:59.762957864 +0000 UTC m=+30.588868379" watchObservedRunningTime="2026-01-22 09:04:00.798741429 +0000 UTC m=+31.624651984" Jan 22 09:04:00 crc kubenswrapper[4681]: I0122 09:04:00.800592 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vjf2g"] Jan 22 09:04:00 crc kubenswrapper[4681]: I0122 09:04:00.800793 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:04:00 crc kubenswrapper[4681]: E0122 09:04:00.800992 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjf2g" podUID="2e7e003a-24ec-4f48-a156-a5ed6a3afd03" Jan 22 09:04:01 crc kubenswrapper[4681]: I0122 09:04:01.094927 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:04:02 crc kubenswrapper[4681]: I0122 09:04:02.451895 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:04:02 crc kubenswrapper[4681]: I0122 09:04:02.451970 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:04:02 crc kubenswrapper[4681]: I0122 09:04:02.452075 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:04:02 crc kubenswrapper[4681]: I0122 09:04:02.453230 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:04:02 crc kubenswrapper[4681]: E0122 09:04:02.453414 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjf2g" podUID="2e7e003a-24ec-4f48-a156-a5ed6a3afd03" Jan 22 09:04:02 crc kubenswrapper[4681]: E0122 09:04:02.453594 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 22 09:04:02 crc kubenswrapper[4681]: E0122 09:04:02.453871 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 22 09:04:02 crc kubenswrapper[4681]: E0122 09:04:02.453984 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 22 09:04:03 crc kubenswrapper[4681]: I0122 09:04:03.205938 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:04:03 crc kubenswrapper[4681]: I0122 09:04:03.206546 4681 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 22 09:04:03 crc kubenswrapper[4681]: I0122 09:04:03.229883 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.100505 4681 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.100843 4681 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.154705 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mtcwj"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.155292 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.155686 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.156116 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.156164 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6rt8z"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.156657 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.159671 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-c2p8w"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.161189 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.162215 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.162911 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.163783 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.164003 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.164153 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.164327 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.164472 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.164659 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.164820 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.165817 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dmfxj"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.166641 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.165995 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.168300 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.171581 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktf4x"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.171797 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.172923 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktf4x" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.186780 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.187397 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.195403 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.195630 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.195809 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.196253 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.197044 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.198530 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-glh9f"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.199395 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.200472 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.201339 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.201947 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.202040 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.201957 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.202190 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.202351 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.202547 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.202901 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.203502 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 22 09:04:04 
crc kubenswrapper[4681]: I0122 09:04:04.204187 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.204489 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.205414 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.205495 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.205611 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.205719 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.205762 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.205890 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.205974 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-klxjr"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.206224 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.205932 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.219727 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.220120 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.220332 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-klxjr" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.224019 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.224199 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.224704 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.224955 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.225080 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.225433 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.225501 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.225691 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.225819 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.225847 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.226862 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.227005 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.227426 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.227965 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.228118 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.228208 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.228431 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.228517 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.228539 4681 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.228900 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.231020 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.231218 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.231470 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.231758 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.231863 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.231991 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.232013 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.231761 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.232156 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.232206 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.232149 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.233321 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-64gjv"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.234101 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-64gjv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.238366 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.246605 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.248315 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.249151 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.251284 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.251599 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-fvrbp"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.251885 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-btm7s"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.252203 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.252568 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.252766 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.253006 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-fvrbp" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.255470 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.255780 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.255793 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.256304 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.256315 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.256559 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.263555 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.272930 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.273074 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.273966 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.274227 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.277841 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.277968 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.278068 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.278170 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.278867 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.279554 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:04 crc kubenswrapper[4681]: 
I0122 09:04:04.279765 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/363db0df-34ba-45e7-abce-c19cd7cc4d24-audit-dir\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.279797 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e2a0e30-6d84-4db6-bb01-3012041a2b84-config\") pod \"authentication-operator-69f744f599-btm7s\" (UID: \"0e2a0e30-6d84-4db6-bb01-3012041a2b84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.279823 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f828bb9-12a2-4148-999b-ab78f638d4b0-config\") pod \"console-operator-58897d9998-64gjv\" (UID: \"3f828bb9-12a2-4148-999b-ab78f638d4b0\") " pod="openshift-console-operator/console-operator-58897d9998-64gjv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.279848 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/363db0df-34ba-45e7-abce-c19cd7cc4d24-encryption-config\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.279879 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5d28f6be-a6f5-4605-b071-ec453a08a7d7-images\") pod \"machine-api-operator-5694c8668f-mtcwj\" (UID: \"5d28f6be-a6f5-4605-b071-ec453a08a7d7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.279901 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c26ba6fb-8b7a-4207-82ed-3b746c50e824-service-ca\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.279923 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fm7t\" (UniqueName: \"kubernetes.io/projected/3f828bb9-12a2-4148-999b-ab78f638d4b0-kube-api-access-8fm7t\") pod \"console-operator-58897d9998-64gjv\" (UID: \"3f828bb9-12a2-4148-999b-ab78f638d4b0\") " pod="openshift-console-operator/console-operator-58897d9998-64gjv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.279943 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/363db0df-34ba-45e7-abce-c19cd7cc4d24-node-pullsecrets\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.279964 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/363db0df-34ba-45e7-abce-c19cd7cc4d24-config\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.279992 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3566a8b9-503a-4c4b-a008-8bfeb5e38fa8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ktf4x\" (UID: \"3566a8b9-503a-4c4b-a008-8bfeb5e38fa8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktf4x" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280047 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c26ba6fb-8b7a-4207-82ed-3b746c50e824-oauth-serving-cert\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280112 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e2a0e30-6d84-4db6-bb01-3012041a2b84-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-btm7s\" (UID: \"0e2a0e30-6d84-4db6-bb01-3012041a2b84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" Jan 22 09:04:04 crc kubenswrapper[4681]: E0122 09:04:04.280154 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:20.280134185 +0000 UTC m=+51.106044750 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280179 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c26ba6fb-8b7a-4207-82ed-3b746c50e824-console-serving-cert\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280203 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/64b9cc65-319e-48c9-9772-0abae151c1ba-audit-policies\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280224 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/64b9cc65-319e-48c9-9772-0abae151c1ba-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280279 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b49c3e1-6d6d-498b-81a7-f40174f7f7c1-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vwwtm\" (UID: \"4b49c3e1-6d6d-498b-81a7-f40174f7f7c1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280305 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr4cp\" (UniqueName: \"kubernetes.io/projected/5c1adb5a-0fb0-4a16-a42e-0b79ec825963-kube-api-access-pr4cp\") pod \"downloads-7954f5f757-fvrbp\" (UID: \"5c1adb5a-0fb0-4a16-a42e-0b79ec825963\") " pod="openshift-console/downloads-7954f5f757-fvrbp" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280339 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vv87\" (UniqueName: \"kubernetes.io/projected/5d28f6be-a6f5-4605-b071-ec453a08a7d7-kube-api-access-6vv87\") pod \"machine-api-operator-5694c8668f-mtcwj\" (UID: \"5d28f6be-a6f5-4605-b071-ec453a08a7d7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280397 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 
09:04:04.280585 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q674l\" (UniqueName: \"kubernetes.io/projected/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-kube-api-access-q674l\") pod \"route-controller-manager-6576b87f9c-ckctb\" (UID: \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280610 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69vcp\" (UniqueName: \"kubernetes.io/projected/4b49c3e1-6d6d-498b-81a7-f40174f7f7c1-kube-api-access-69vcp\") pod \"cluster-image-registry-operator-dc59b4c8b-vwwtm\" (UID: \"4b49c3e1-6d6d-498b-81a7-f40174f7f7c1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280699 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e3bd3021-b5e7-4c2c-8152-6f0450cea681-audit-dir\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280732 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e2a0e30-6d84-4db6-bb01-3012041a2b84-service-ca-bundle\") pod \"authentication-operator-69f744f599-btm7s\" (UID: \"0e2a0e30-6d84-4db6-bb01-3012041a2b84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280790 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280836 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfhkd\" (UniqueName: \"kubernetes.io/projected/363db0df-34ba-45e7-abce-c19cd7cc4d24-kube-api-access-zfhkd\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280866 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3917a9c-dd78-498f-96b3-36fd8d6421c6-config\") pod \"machine-approver-56656f9798-4rq95\" (UID: \"c3917a9c-dd78-498f-96b3-36fd8d6421c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280899 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvbkv\" (UniqueName: \"kubernetes.io/projected/e3bd3021-b5e7-4c2c-8152-6f0450cea681-kube-api-access-jvbkv\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 
09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280929 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc5wx\" (UniqueName: \"kubernetes.io/projected/2e3ad21c-5666-479f-9ee9-0ccdf0f2c0fa-kube-api-access-nc5wx\") pod \"cluster-samples-operator-665b6dd947-klxjr\" (UID: \"2e3ad21c-5666-479f-9ee9-0ccdf0f2c0fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-klxjr" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.280975 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/64b9cc65-319e-48c9-9772-0abae151c1ba-encryption-config\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281003 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kl8k\" (UniqueName: \"kubernetes.io/projected/47db6098-5a83-4d02-bec9-886b3dd01a4f-kube-api-access-5kl8k\") pod \"controller-manager-879f6c89f-6rt8z\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281044 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8srj\" (UniqueName: \"kubernetes.io/projected/3566a8b9-503a-4c4b-a008-8bfeb5e38fa8-kube-api-access-t8srj\") pod \"openshift-apiserver-operator-796bbdcf4f-ktf4x\" (UID: \"3566a8b9-503a-4c4b-a008-8bfeb5e38fa8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktf4x" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281071 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64b9cc65-319e-48c9-9772-0abae151c1ba-serving-cert\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281108 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281176 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c26ba6fb-8b7a-4207-82ed-3b746c50e824-console-config\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281205 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f828bb9-12a2-4148-999b-ab78f638d4b0-serving-cert\") pod \"console-operator-58897d9998-64gjv\" (UID: \"3f828bb9-12a2-4148-999b-ab78f638d4b0\") " pod="openshift-console-operator/console-operator-58897d9998-64gjv" Jan 22 09:04:04 crc 
kubenswrapper[4681]: I0122 09:04:04.281226 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-serving-cert\") pod \"route-controller-manager-6576b87f9c-ckctb\" (UID: \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281276 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b49c3e1-6d6d-498b-81a7-f40174f7f7c1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vwwtm\" (UID: \"4b49c3e1-6d6d-498b-81a7-f40174f7f7c1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281304 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/363db0df-34ba-45e7-abce-c19cd7cc4d24-audit\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281340 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281361 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281378 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64b9cc65-319e-48c9-9772-0abae151c1ba-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281405 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281428 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d28f6be-a6f5-4605-b071-ec453a08a7d7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mtcwj\" (UID: \"5d28f6be-a6f5-4605-b071-ec453a08a7d7\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281450 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-audit-policies\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281478 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-client-ca\") pod \"route-controller-manager-6576b87f9c-ckctb\" (UID: \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281530 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b49c3e1-6d6d-498b-81a7-f40174f7f7c1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vwwtm\" (UID: \"4b49c3e1-6d6d-498b-81a7-f40174f7f7c1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281553 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6rt8z\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281574 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d28f6be-a6f5-4605-b071-ec453a08a7d7-config\") pod \"machine-api-operator-5694c8668f-mtcwj\" (UID: \"5d28f6be-a6f5-4605-b071-ec453a08a7d7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281597 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ecda186-bc95-4f85-89cb-1c3fcc1354ce-serving-cert\") pod \"openshift-config-operator-7777fb866f-4dfqb\" (UID: \"5ecda186-bc95-4f85-89cb-1c3fcc1354ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281623 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c6xv\" (UniqueName: \"kubernetes.io/projected/5ecda186-bc95-4f85-89cb-1c3fcc1354ce-kube-api-access-5c6xv\") pod \"openshift-config-operator-7777fb866f-4dfqb\" (UID: \"5ecda186-bc95-4f85-89cb-1c3fcc1354ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281648 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-config\") pod \"route-controller-manager-6576b87f9c-ckctb\" (UID: \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281682 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c3917a9c-dd78-498f-96b3-36fd8d6421c6-machine-approver-tls\") pod \"machine-approver-56656f9798-4rq95\" (UID: \"c3917a9c-dd78-498f-96b3-36fd8d6421c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281708 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/363db0df-34ba-45e7-abce-c19cd7cc4d24-image-import-ca\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281729 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e2a0e30-6d84-4db6-bb01-3012041a2b84-serving-cert\") pod \"authentication-operator-69f744f599-btm7s\" (UID: \"0e2a0e30-6d84-4db6-bb01-3012041a2b84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281751 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl87d\" (UniqueName: \"kubernetes.io/projected/c3917a9c-dd78-498f-96b3-36fd8d6421c6-kube-api-access-pl87d\") pod \"machine-approver-56656f9798-4rq95\" (UID: \"c3917a9c-dd78-498f-96b3-36fd8d6421c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.281774 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5ecda186-bc95-4f85-89cb-1c3fcc1354ce-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4dfqb\" (UID: \"5ecda186-bc95-4f85-89cb-1c3fcc1354ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb" Jan 22 09:04:04 crc kubenswrapper[4681]: E0122 09:04:04.281865 4681 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 09:04:04 crc kubenswrapper[4681]: E0122 09:04:04.281918 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 09:04:20.281899152 +0000 UTC m=+51.107809727 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 22 09:04:04 crc kubenswrapper[4681]: E0122 09:04:04.282246 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 09:04:04 crc kubenswrapper[4681]: E0122 09:04:04.282281 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 09:04:04 crc kubenswrapper[4681]: E0122 09:04:04.282295 4681 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:04:04 crc kubenswrapper[4681]: E0122 09:04:04.282408 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-22 09:04:20.282399335 +0000 UTC m=+51.108309840 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.282402 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pfbg7"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.282474 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f828bb9-12a2-4148-999b-ab78f638d4b0-trusted-ca\") pod \"console-operator-58897d9998-64gjv\" (UID: \"3f828bb9-12a2-4148-999b-ab78f638d4b0\") " pod="openshift-console-operator/console-operator-58897d9998-64gjv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.282515 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64b9cc65-319e-48c9-9772-0abae151c1ba-audit-dir\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.282546 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e3ad21c-5666-479f-9ee9-0ccdf0f2c0fa-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-klxjr\" (UID: \"2e3ad21c-5666-479f-9ee9-0ccdf0f2c0fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-klxjr" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.282568 4681 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-config\") pod \"controller-manager-879f6c89f-6rt8z\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.282582 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-client-ca\") pod \"controller-manager-879f6c89f-6rt8z\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.282606 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.282658 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26ba6fb-8b7a-4207-82ed-3b746c50e824-trusted-ca-bundle\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.282708 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkrqk\" (UniqueName: \"kubernetes.io/projected/c26ba6fb-8b7a-4207-82ed-3b746c50e824-kube-api-access-wkrqk\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.282726 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/363db0df-34ba-45e7-abce-c19cd7cc4d24-trusted-ca-bundle\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.282762 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.282780 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prbds\" (UniqueName: \"kubernetes.io/projected/64b9cc65-319e-48c9-9772-0abae151c1ba-kube-api-access-prbds\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.282793 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/363db0df-34ba-45e7-abce-c19cd7cc4d24-serving-cert\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.282828 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/363db0df-34ba-45e7-abce-c19cd7cc4d24-etcd-serving-ca\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.282848 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c3917a9c-dd78-498f-96b3-36fd8d6421c6-auth-proxy-config\") pod \"machine-approver-56656f9798-4rq95\" (UID: \"c3917a9c-dd78-498f-96b3-36fd8d6421c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.282880 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:04:04 crc kubenswrapper[4681]: E0122 09:04:04.282943 4681 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 09:04:04 crc kubenswrapper[4681]: E0122 09:04:04.282982 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-22 09:04:20.28297081 +0000 UTC m=+51.108881315 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.283003 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.283023 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.283039 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.283059 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47db6098-5a83-4d02-bec9-886b3dd01a4f-serving-cert\") pod \"controller-manager-879f6c89f-6rt8z\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.283077 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.283100 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/363db0df-34ba-45e7-abce-c19cd7cc4d24-etcd-client\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.283114 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.283189 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zknvr"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.283116 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.283348 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c26ba6fb-8b7a-4207-82ed-3b746c50e824-console-oauth-config\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.283362 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/64b9cc65-319e-48c9-9772-0abae151c1ba-etcd-client\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.283380 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.283396 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3566a8b9-503a-4c4b-a008-8bfeb5e38fa8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ktf4x\" (UID: \"3566a8b9-503a-4c4b-a008-8bfeb5e38fa8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktf4x" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.283411 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpfvm\" (UniqueName: \"kubernetes.io/projected/0e2a0e30-6d84-4db6-bb01-3012041a2b84-kube-api-access-gpfvm\") pod \"authentication-operator-69f744f599-btm7s\" (UID: \"0e2a0e30-6d84-4db6-bb01-3012041a2b84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" Jan 22 09:04:04 crc kubenswrapper[4681]: E0122 09:04:04.283553 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 22 09:04:04 crc kubenswrapper[4681]: E0122 09:04:04.283566 4681 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 22 09:04:04 crc kubenswrapper[4681]: E0122 09:04:04.283575 4681 projected.go:194] Error preparing data for projected volume 
kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:04:04 crc kubenswrapper[4681]: E0122 09:04:04.283598 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-22 09:04:20.283591476 +0000 UTC m=+51.109501971 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.283667 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zknvr" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.284988 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.285152 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.284992 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.285814 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.286901 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.287990 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-c287z"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.288493 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.288879 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-wcld5"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.289219 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.289460 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.289775 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.290485 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.290596 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.290673 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-splh5"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.291285 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-splh5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.307665 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.308649 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.308753 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pnhr5"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.309634 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pnhr5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.310349 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.310667 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.314950 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.317734 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgbd"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.319108 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgbd" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.321230 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.321996 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.322187 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.322423 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.322512 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r5cb5"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.323313 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r5cb5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.327706 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.327969 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fmngz"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.328642 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.329053 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.329151 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.329229 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fmngz" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.329509 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gfd8p"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.329941 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gfd8p" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.332851 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qhf7x"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.332929 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.333212 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.334820 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.335436 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.337793 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.337961 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.339709 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.341021 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.341818 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.342389 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.343150 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.343219 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.344409 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.344855 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-fqc8k"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.345369 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-fqc8k" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.346869 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.347733 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.350288 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5rpfx"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.350992 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5rpfx" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.352580 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qjngv"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.353250 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qjngv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.354945 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-s5tkc"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.355607 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.355971 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-4xk8b"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.356418 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.356600 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-s5tkc" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.356813 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.362189 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.362376 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.363612 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktf4x"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.364316 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-glh9f"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.365300 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6rt8z"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.366505 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-klxjr"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.368085 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mtcwj"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.370510 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dmfxj"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.384734 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/363db0df-34ba-45e7-abce-c19cd7cc4d24-audit-dir\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.384778 4681 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e2a0e30-6d84-4db6-bb01-3012041a2b84-config\") pod \"authentication-operator-69f744f599-btm7s\" (UID: \"0e2a0e30-6d84-4db6-bb01-3012041a2b84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.384806 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r9hj\" (UniqueName: \"kubernetes.io/projected/0a017a02-2d7a-442d-befc-943f6dc038cd-kube-api-access-8r9hj\") pod \"machine-config-controller-84d6567774-gdz6n\" (UID: \"0a017a02-2d7a-442d-befc-943f6dc038cd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.384828 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b987p\" (UniqueName: \"kubernetes.io/projected/163e41b7-e7e1-4f98-80df-ea25cca890e5-kube-api-access-b987p\") pod \"ingress-operator-5b745b69d9-cqzzn\" (UID: \"163e41b7-e7e1-4f98-80df-ea25cca890e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.384854 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f828bb9-12a2-4148-999b-ab78f638d4b0-config\") pod \"console-operator-58897d9998-64gjv\" (UID: \"3f828bb9-12a2-4148-999b-ab78f638d4b0\") " pod="openshift-console-operator/console-operator-58897d9998-64gjv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.384878 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a235c2-8a4c-41c8-b2c2-97ddad58da8b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-r5cb5\" (UID: \"f3a235c2-8a4c-41c8-b2c2-97ddad58da8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r5cb5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.384899 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/49d5a414-d020-4687-8d32-3141061d0c80-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-9kg44\" (UID: \"49d5a414-d020-4687-8d32-3141061d0c80\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.384922 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5d28f6be-a6f5-4605-b071-ec453a08a7d7-images\") pod \"machine-api-operator-5694c8668f-mtcwj\" (UID: \"5d28f6be-a6f5-4605-b071-ec453a08a7d7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.384943 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c26ba6fb-8b7a-4207-82ed-3b746c50e824-service-ca\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.384963 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1394b89-92be-4238-8c46-0be31f9ba572-config\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.384985 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3566a8b9-503a-4c4b-a008-8bfeb5e38fa8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ktf4x\" (UID: \"3566a8b9-503a-4c4b-a008-8bfeb5e38fa8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktf4x" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385019 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/64b9cc65-319e-48c9-9772-0abae151c1ba-audit-policies\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385037 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/64b9cc65-319e-48c9-9772-0abae151c1ba-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385057 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163e41b7-e7e1-4f98-80df-ea25cca890e5-metrics-tls\") pod \"ingress-operator-5b745b69d9-cqzzn\" (UID: \"163e41b7-e7e1-4f98-80df-ea25cca890e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385080 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/293c5501-59eb-41ae-b7c4-19b7cffb2b6b-node-bootstrap-token\") pod \"machine-config-server-fqc8k\" (UID: \"293c5501-59eb-41ae-b7c4-19b7cffb2b6b\") " pod="openshift-machine-config-operator/machine-config-server-fqc8k" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385104 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63433617-6d4f-45f6-9b31-51313dbb4985-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2hgbd\" (UID: \"63433617-6d4f-45f6-9b31-51313dbb4985\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgbd" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385125 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whjzk\" (UniqueName: \"kubernetes.io/projected/6d75b145-9547-49b4-9aea-652ea33cb371-kube-api-access-whjzk\") pod \"marketplace-operator-79b997595-qhf7x\" (UID: \"6d75b145-9547-49b4-9aea-652ea33cb371\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385164 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/4b49c3e1-6d6d-498b-81a7-f40174f7f7c1-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vwwtm\" (UID: \"4b49c3e1-6d6d-498b-81a7-f40174f7f7c1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385187 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385210 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q674l\" (UniqueName: \"kubernetes.io/projected/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-kube-api-access-q674l\") pod \"route-controller-manager-6576b87f9c-ckctb\" (UID: \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385232 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e2a0e30-6d84-4db6-bb01-3012041a2b84-service-ca-bundle\") pod \"authentication-operator-69f744f599-btm7s\" (UID: \"0e2a0e30-6d84-4db6-bb01-3012041a2b84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385252 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a6477b5b-15c2-458c-ba42-4be5ca90acb6-metrics-tls\") pod \"dns-operator-744455d44c-s5tkc\" (UID: \"a6477b5b-15c2-458c-ba42-4be5ca90acb6\") " pod="openshift-dns-operator/dns-operator-744455d44c-s5tkc" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385300 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvbkv\" (UniqueName: \"kubernetes.io/projected/e3bd3021-b5e7-4c2c-8152-6f0450cea681-kube-api-access-jvbkv\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385326 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47f80193-b6ae-4185-aa1f-320ba7f8dce9-metrics-certs\") pod \"router-default-5444994796-wcld5\" (UID: \"47f80193-b6ae-4185-aa1f-320ba7f8dce9\") " pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385353 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/793ce360-5662-4226-9eb1-d11580d41655-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pnhr5\" (UID: \"793ce360-5662-4226-9eb1-d11580d41655\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pnhr5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385375 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kl8k\" (UniqueName: 
\"kubernetes.io/projected/47db6098-5a83-4d02-bec9-886b3dd01a4f-kube-api-access-5kl8k\") pod \"controller-manager-879f6c89f-6rt8z\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385397 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmbd6\" (UniqueName: \"kubernetes.io/projected/a2419737-5527-429d-a3b9-213a60b502cb-kube-api-access-zmbd6\") pod \"packageserver-d55dfcdfc-m5r66\" (UID: \"a2419737-5527-429d-a3b9-213a60b502cb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385420 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq9mm\" (UniqueName: \"kubernetes.io/projected/74aff32c-9835-440b-9961-5fbcada6c96b-kube-api-access-nq9mm\") pod \"control-plane-machine-set-operator-78cbb6b69f-gfd8p\" (UID: \"74aff32c-9835-440b-9961-5fbcada6c96b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gfd8p" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385439 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a1394b89-92be-4238-8c46-0be31f9ba572-etcd-service-ca\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385467 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8srj\" (UniqueName: \"kubernetes.io/projected/3566a8b9-503a-4c4b-a008-8bfeb5e38fa8-kube-api-access-t8srj\") pod \"openshift-apiserver-operator-796bbdcf4f-ktf4x\" (UID: \"3566a8b9-503a-4c4b-a008-8bfeb5e38fa8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktf4x" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385488 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-serving-cert\") pod \"route-controller-manager-6576b87f9c-ckctb\" (UID: \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385507 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/363db0df-34ba-45e7-abce-c19cd7cc4d24-audit\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385526 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385549 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/293c5501-59eb-41ae-b7c4-19b7cffb2b6b-certs\") pod \"machine-config-server-fqc8k\" (UID: \"293c5501-59eb-41ae-b7c4-19b7cffb2b6b\") " pod="openshift-machine-config-operator/machine-config-server-fqc8k" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385569 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cd26dd1-2322-4cca-9d02-7d924ce912ee-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-splh5\" (UID: \"7cd26dd1-2322-4cca-9d02-7d924ce912ee\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-splh5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385590 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b3515bca-636e-4f35-b3d5-3897757ec083-srv-cert\") pod \"catalog-operator-68c6474976-qxrpc\" (UID: \"b3515bca-636e-4f35-b3d5-3897757ec083\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385618 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d28f6be-a6f5-4605-b071-ec453a08a7d7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mtcwj\" (UID: \"5d28f6be-a6f5-4605-b071-ec453a08a7d7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385639 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f43902ae-4bee-4612-8e55-ca6ffc779ec0-proxy-tls\") pod \"machine-config-operator-74547568cd-hhrz2\" (UID: \"f43902ae-4bee-4612-8e55-ca6ffc779ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385661 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d28f6be-a6f5-4605-b071-ec453a08a7d7-config\") pod \"machine-api-operator-5694c8668f-mtcwj\" (UID: \"5d28f6be-a6f5-4605-b071-ec453a08a7d7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385682 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ecda186-bc95-4f85-89cb-1c3fcc1354ce-serving-cert\") pod \"openshift-config-operator-7777fb866f-4dfqb\" (UID: \"5ecda186-bc95-4f85-89cb-1c3fcc1354ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385702 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c6xv\" (UniqueName: \"kubernetes.io/projected/5ecda186-bc95-4f85-89cb-1c3fcc1354ce-kube-api-access-5c6xv\") pod \"openshift-config-operator-7777fb866f-4dfqb\" (UID: \"5ecda186-bc95-4f85-89cb-1c3fcc1354ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385724 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv4cf\" (UniqueName: 
\"kubernetes.io/projected/2d3823b7-bc6c-4206-9a4a-347488ed67ba-kube-api-access-sv4cf\") pod \"migrator-59844c95c7-fmngz\" (UID: \"2d3823b7-bc6c-4206-9a4a-347488ed67ba\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fmngz" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385744 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-client-ca\") pod \"route-controller-manager-6576b87f9c-ckctb\" (UID: \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385763 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b49c3e1-6d6d-498b-81a7-f40174f7f7c1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vwwtm\" (UID: \"4b49c3e1-6d6d-498b-81a7-f40174f7f7c1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385784 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zkww\" (UniqueName: \"kubernetes.io/projected/a1394b89-92be-4238-8c46-0be31f9ba572-kube-api-access-6zkww\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385804 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac71e92a-9043-4004-b570-a0ce86bf2e76-config\") pod \"service-ca-operator-777779d784-m2xbf\" (UID: \"ac71e92a-9043-4004-b570-a0ce86bf2e76\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385826 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-config\") pod \"route-controller-manager-6576b87f9c-ckctb\" (UID: \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385859 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c3917a9c-dd78-498f-96b3-36fd8d6421c6-machine-approver-tls\") pod \"machine-approver-56656f9798-4rq95\" (UID: \"c3917a9c-dd78-498f-96b3-36fd8d6421c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385878 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/363db0df-34ba-45e7-abce-c19cd7cc4d24-image-import-ca\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385897 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl87d\" (UniqueName: \"kubernetes.io/projected/c3917a9c-dd78-498f-96b3-36fd8d6421c6-kube-api-access-pl87d\") pod \"machine-approver-56656f9798-4rq95\" (UID: 
\"c3917a9c-dd78-498f-96b3-36fd8d6421c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385917 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5ecda186-bc95-4f85-89cb-1c3fcc1354ce-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4dfqb\" (UID: \"5ecda186-bc95-4f85-89cb-1c3fcc1354ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385941 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/74aff32c-9835-440b-9961-5fbcada6c96b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gfd8p\" (UID: \"74aff32c-9835-440b-9961-5fbcada6c96b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gfd8p" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385959 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a2419737-5527-429d-a3b9-213a60b502cb-webhook-cert\") pod \"packageserver-d55dfcdfc-m5r66\" (UID: \"a2419737-5527-429d-a3b9-213a60b502cb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385977 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/793ce360-5662-4226-9eb1-d11580d41655-config\") pod \"kube-apiserver-operator-766d6c64bb-pnhr5\" (UID: \"793ce360-5662-4226-9eb1-d11580d41655\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pnhr5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.385997 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13879fd8-0486-46df-8a5d-ef6c81d712fa-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zknvr\" (UID: \"13879fd8-0486-46df-8a5d-ef6c81d712fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zknvr" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386019 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f828bb9-12a2-4148-999b-ab78f638d4b0-trusted-ca\") pod \"console-operator-58897d9998-64gjv\" (UID: \"3f828bb9-12a2-4148-999b-ab78f638d4b0\") " pod="openshift-console-operator/console-operator-58897d9998-64gjv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386039 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-client-ca\") pod \"controller-manager-879f6c89f-6rt8z\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386055 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386074 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e3ad21c-5666-479f-9ee9-0ccdf0f2c0fa-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-klxjr\" (UID: \"2e3ad21c-5666-479f-9ee9-0ccdf0f2c0fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-klxjr" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386094 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49zkc\" (UniqueName: \"kubernetes.io/projected/b3515bca-636e-4f35-b3d5-3897757ec083-kube-api-access-49zkc\") pod \"catalog-operator-68c6474976-qxrpc\" (UID: \"b3515bca-636e-4f35-b3d5-3897757ec083\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386115 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f43902ae-4bee-4612-8e55-ca6ffc779ec0-images\") pod \"machine-config-operator-74547568cd-hhrz2\" (UID: \"f43902ae-4bee-4612-8e55-ca6ffc779ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386134 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s4ch\" (UniqueName: \"kubernetes.io/projected/47f80193-b6ae-4185-aa1f-320ba7f8dce9-kube-api-access-5s4ch\") pod \"router-default-5444994796-wcld5\" (UID: \"47f80193-b6ae-4185-aa1f-320ba7f8dce9\") " pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386159 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkrqk\" (UniqueName: \"kubernetes.io/projected/c26ba6fb-8b7a-4207-82ed-3b746c50e824-kube-api-access-wkrqk\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386187 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f43902ae-4bee-4612-8e55-ca6ffc779ec0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hhrz2\" (UID: \"f43902ae-4bee-4612-8e55-ca6ffc779ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386213 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5334d917-0f71-4e93-ae7a-2a169f3b7a34-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-4xk8b\" (UID: \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\") " pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386229 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a2419737-5527-429d-a3b9-213a60b502cb-tmpfs\") pod \"packageserver-d55dfcdfc-m5r66\" (UID: 
\"a2419737-5527-429d-a3b9-213a60b502cb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386279 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prbds\" (UniqueName: \"kubernetes.io/projected/64b9cc65-319e-48c9-9772-0abae151c1ba-kube-api-access-prbds\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386303 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/363db0df-34ba-45e7-abce-c19cd7cc4d24-serving-cert\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386321 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv4wh\" (UniqueName: \"kubernetes.io/projected/293c5501-59eb-41ae-b7c4-19b7cffb2b6b-kube-api-access-kv4wh\") pod \"machine-config-server-fqc8k\" (UID: \"293c5501-59eb-41ae-b7c4-19b7cffb2b6b\") " pod="openshift-machine-config-operator/machine-config-server-fqc8k" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386347 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rrth\" (UniqueName: \"kubernetes.io/projected/13879fd8-0486-46df-8a5d-ef6c81d712fa-kube-api-access-9rrth\") pod \"openshift-controller-manager-operator-756b6f6bc6-zknvr\" (UID: \"13879fd8-0486-46df-8a5d-ef6c81d712fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zknvr" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386373 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386395 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h9bw\" (UniqueName: \"kubernetes.io/projected/559df17a-f729-4647-9893-64aa96331ed6-kube-api-access-9h9bw\") pod \"ingress-canary-5rpfx\" (UID: \"559df17a-f729-4647-9893-64aa96331ed6\") " pod="openshift-ingress-canary/ingress-canary-5rpfx" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386413 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/363db0df-34ba-45e7-abce-c19cd7cc4d24-etcd-serving-ca\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386437 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386457 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/163e41b7-e7e1-4f98-80df-ea25cca890e5-trusted-ca\") pod \"ingress-operator-5b745b69d9-cqzzn\" (UID: \"163e41b7-e7e1-4f98-80df-ea25cca890e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386478 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1394b89-92be-4238-8c46-0be31f9ba572-serving-cert\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386500 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386519 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386550 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3566a8b9-503a-4c4b-a008-8bfeb5e38fa8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ktf4x\" (UID: \"3566a8b9-503a-4c4b-a008-8bfeb5e38fa8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktf4x" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386569 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpfvm\" (UniqueName: \"kubernetes.io/projected/0e2a0e30-6d84-4db6-bb01-3012041a2b84-kube-api-access-gpfvm\") pod \"authentication-operator-69f744f599-btm7s\" (UID: \"0e2a0e30-6d84-4db6-bb01-3012041a2b84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386592 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/363db0df-34ba-45e7-abce-c19cd7cc4d24-encryption-config\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386610 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a1394b89-92be-4238-8c46-0be31f9ba572-etcd-ca\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386634 4681 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d75b145-9547-49b4-9aea-652ea33cb371-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qhf7x\" (UID: \"6d75b145-9547-49b4-9aea-652ea33cb371\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386659 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/5334d917-0f71-4e93-ae7a-2a169f3b7a34-ready\") pod \"cni-sysctl-allowlist-ds-4xk8b\" (UID: \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\") " pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386684 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/363db0df-34ba-45e7-abce-c19cd7cc4d24-node-pullsecrets\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386705 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/363db0df-34ba-45e7-abce-c19cd7cc4d24-config\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386725 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd26dd1-2322-4cca-9d02-7d924ce912ee-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-splh5\" (UID: \"7cd26dd1-2322-4cca-9d02-7d924ce912ee\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-splh5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386745 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13879fd8-0486-46df-8a5d-ef6c81d712fa-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zknvr\" (UID: \"13879fd8-0486-46df-8a5d-ef6c81d712fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zknvr" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386764 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7n82\" (UniqueName: \"kubernetes.io/projected/f3a235c2-8a4c-41c8-b2c2-97ddad58da8b-kube-api-access-g7n82\") pod \"kube-storage-version-migrator-operator-b67b599dd-r5cb5\" (UID: \"f3a235c2-8a4c-41c8-b2c2-97ddad58da8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r5cb5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386784 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fm7t\" (UniqueName: \"kubernetes.io/projected/3f828bb9-12a2-4148-999b-ab78f638d4b0-kube-api-access-8fm7t\") pod \"console-operator-58897d9998-64gjv\" (UID: \"3f828bb9-12a2-4148-999b-ab78f638d4b0\") " pod="openshift-console-operator/console-operator-58897d9998-64gjv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386805 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c26ba6fb-8b7a-4207-82ed-3b746c50e824-oauth-serving-cert\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386825 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e2a0e30-6d84-4db6-bb01-3012041a2b84-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-btm7s\" (UID: \"0e2a0e30-6d84-4db6-bb01-3012041a2b84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386846 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63433617-6d4f-45f6-9b31-51313dbb4985-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2hgbd\" (UID: \"63433617-6d4f-45f6-9b31-51313dbb4985\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgbd" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386865 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c26ba6fb-8b7a-4207-82ed-3b746c50e824-console-serving-cert\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386885 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfvn6\" (UniqueName: \"kubernetes.io/projected/5334d917-0f71-4e93-ae7a-2a169f3b7a34-kube-api-access-jfvn6\") pod \"cni-sysctl-allowlist-ds-4xk8b\" (UID: \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\") " pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386908 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr4cp\" (UniqueName: \"kubernetes.io/projected/5c1adb5a-0fb0-4a16-a42e-0b79ec825963-kube-api-access-pr4cp\") pod \"downloads-7954f5f757-fvrbp\" (UID: \"5c1adb5a-0fb0-4a16-a42e-0b79ec825963\") " pod="openshift-console/downloads-7954f5f757-fvrbp" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386929 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vv87\" (UniqueName: \"kubernetes.io/projected/5d28f6be-a6f5-4605-b071-ec453a08a7d7-kube-api-access-6vv87\") pod \"machine-api-operator-5694c8668f-mtcwj\" (UID: \"5d28f6be-a6f5-4605-b071-ec453a08a7d7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386946 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/559df17a-f729-4647-9893-64aa96331ed6-cert\") pod \"ingress-canary-5rpfx\" (UID: \"559df17a-f729-4647-9893-64aa96331ed6\") " pod="openshift-ingress-canary/ingress-canary-5rpfx" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.386994 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69vcp\" (UniqueName: \"kubernetes.io/projected/4b49c3e1-6d6d-498b-81a7-f40174f7f7c1-kube-api-access-69vcp\") pod 
\"cluster-image-registry-operator-dc59b4c8b-vwwtm\" (UID: \"4b49c3e1-6d6d-498b-81a7-f40174f7f7c1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387014 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2419737-5527-429d-a3b9-213a60b502cb-apiservice-cert\") pod \"packageserver-d55dfcdfc-m5r66\" (UID: \"a2419737-5527-429d-a3b9-213a60b502cb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387033 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b3515bca-636e-4f35-b3d5-3897757ec083-profile-collector-cert\") pod \"catalog-operator-68c6474976-qxrpc\" (UID: \"b3515bca-636e-4f35-b3d5-3897757ec083\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387053 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e3bd3021-b5e7-4c2c-8152-6f0450cea681-audit-dir\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387075 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387096 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3917a9c-dd78-498f-96b3-36fd8d6421c6-config\") pod \"machine-approver-56656f9798-4rq95\" (UID: \"c3917a9c-dd78-498f-96b3-36fd8d6421c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387116 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/47f80193-b6ae-4185-aa1f-320ba7f8dce9-stats-auth\") pod \"router-default-5444994796-wcld5\" (UID: \"47f80193-b6ae-4185-aa1f-320ba7f8dce9\") " pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387136 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkkgg\" (UniqueName: \"kubernetes.io/projected/f43902ae-4bee-4612-8e55-ca6ffc779ec0-kube-api-access-zkkgg\") pod \"machine-config-operator-74547568cd-hhrz2\" (UID: \"f43902ae-4bee-4612-8e55-ca6ffc779ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387157 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfhkd\" (UniqueName: \"kubernetes.io/projected/363db0df-34ba-45e7-abce-c19cd7cc4d24-kube-api-access-zfhkd\") pod \"apiserver-76f77b778f-c2p8w\" (UID: 
\"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387187 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d75b145-9547-49b4-9aea-652ea33cb371-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qhf7x\" (UID: \"6d75b145-9547-49b4-9aea-652ea33cb371\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387208 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5334d917-0f71-4e93-ae7a-2a169f3b7a34-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-4xk8b\" (UID: \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\") " pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387228 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc5wx\" (UniqueName: \"kubernetes.io/projected/2e3ad21c-5666-479f-9ee9-0ccdf0f2c0fa-kube-api-access-nc5wx\") pod \"cluster-samples-operator-665b6dd947-klxjr\" (UID: \"2e3ad21c-5666-479f-9ee9-0ccdf0f2c0fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-klxjr" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387246 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a1394b89-92be-4238-8c46-0be31f9ba572-etcd-client\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387285 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/64b9cc65-319e-48c9-9772-0abae151c1ba-encryption-config\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387307 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/47f80193-b6ae-4185-aa1f-320ba7f8dce9-default-certificate\") pod \"router-default-5444994796-wcld5\" (UID: \"47f80193-b6ae-4185-aa1f-320ba7f8dce9\") " pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387328 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw7sx\" (UniqueName: \"kubernetes.io/projected/ac71e92a-9043-4004-b570-a0ce86bf2e76-kube-api-access-zw7sx\") pod \"service-ca-operator-777779d784-m2xbf\" (UID: \"ac71e92a-9043-4004-b570-a0ce86bf2e76\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387353 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64b9cc65-319e-48c9-9772-0abae151c1ba-serving-cert\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 
09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387373 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c26ba6fb-8b7a-4207-82ed-3b746c50e824-console-config\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387395 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f828bb9-12a2-4148-999b-ab78f638d4b0-serving-cert\") pod \"console-operator-58897d9998-64gjv\" (UID: \"3f828bb9-12a2-4148-999b-ab78f638d4b0\") " pod="openshift-console-operator/console-operator-58897d9998-64gjv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387418 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b49c3e1-6d6d-498b-81a7-f40174f7f7c1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vwwtm\" (UID: \"4b49c3e1-6d6d-498b-81a7-f40174f7f7c1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387442 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njqq6\" (UniqueName: \"kubernetes.io/projected/a6477b5b-15c2-458c-ba42-4be5ca90acb6-kube-api-access-njqq6\") pod \"dns-operator-744455d44c-s5tkc\" (UID: \"a6477b5b-15c2-458c-ba42-4be5ca90acb6\") " pod="openshift-dns-operator/dns-operator-744455d44c-s5tkc" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387482 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47f80193-b6ae-4185-aa1f-320ba7f8dce9-service-ca-bundle\") pod \"router-default-5444994796-wcld5\" (UID: \"47f80193-b6ae-4185-aa1f-320ba7f8dce9\") " pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387504 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63433617-6d4f-45f6-9b31-51313dbb4985-config\") pod \"kube-controller-manager-operator-78b949d7b-2hgbd\" (UID: \"63433617-6d4f-45f6-9b31-51313dbb4985\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgbd" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387526 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387546 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64b9cc65-319e-48c9-9772-0abae151c1ba-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387564 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-audit-policies\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387585 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6rt8z\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387608 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e2a0e30-6d84-4db6-bb01-3012041a2b84-serving-cert\") pod \"authentication-operator-69f744f599-btm7s\" (UID: \"0e2a0e30-6d84-4db6-bb01-3012041a2b84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387629 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/793ce360-5662-4226-9eb1-d11580d41655-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pnhr5\" (UID: \"793ce360-5662-4226-9eb1-d11580d41655\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pnhr5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387745 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64b9cc65-319e-48c9-9772-0abae151c1ba-audit-dir\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387810 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-config\") pod \"controller-manager-879f6c89f-6rt8z\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387841 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a017a02-2d7a-442d-befc-943f6dc038cd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gdz6n\" (UID: \"0a017a02-2d7a-442d-befc-943f6dc038cd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.387904 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/363db0df-34ba-45e7-abce-c19cd7cc4d24-audit-dir\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.389553 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.392850 4681 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3917a9c-dd78-498f-96b3-36fd8d6421c6-config\") pod \"machine-approver-56656f9798-4rq95\" (UID: \"c3917a9c-dd78-498f-96b3-36fd8d6421c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.395035 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f828bb9-12a2-4148-999b-ab78f638d4b0-config\") pod \"console-operator-58897d9998-64gjv\" (UID: \"3f828bb9-12a2-4148-999b-ab78f638d4b0\") " pod="openshift-console-operator/console-operator-58897d9998-64gjv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.396768 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d28f6be-a6f5-4605-b071-ec453a08a7d7-config\") pod \"machine-api-operator-5694c8668f-mtcwj\" (UID: \"5d28f6be-a6f5-4605-b071-ec453a08a7d7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.397205 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e2a0e30-6d84-4db6-bb01-3012041a2b84-config\") pod \"authentication-operator-69f744f599-btm7s\" (UID: \"0e2a0e30-6d84-4db6-bb01-3012041a2b84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.398117 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-config\") pod \"controller-manager-879f6c89f-6rt8z\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.398567 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5d28f6be-a6f5-4605-b071-ec453a08a7d7-images\") pod \"machine-api-operator-5694c8668f-mtcwj\" (UID: \"5d28f6be-a6f5-4605-b071-ec453a08a7d7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.398970 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/64b9cc65-319e-48c9-9772-0abae151c1ba-audit-policies\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.399460 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/363db0df-34ba-45e7-abce-c19cd7cc4d24-etcd-serving-ca\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.400322 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/64b9cc65-319e-48c9-9772-0abae151c1ba-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.400493 4681 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.400531 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/363db0df-34ba-45e7-abce-c19cd7cc4d24-serving-cert\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.400917 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.401428 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/363db0df-34ba-45e7-abce-c19cd7cc4d24-image-import-ca\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.401924 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.401957 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-serving-cert\") pod \"route-controller-manager-6576b87f9c-ckctb\" (UID: \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.402628 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9d4hw"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.402906 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-client-ca\") pod \"controller-manager-879f6c89f-6rt8z\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.402993 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/363db0df-34ba-45e7-abce-c19cd7cc4d24-config\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403302 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-audit-policies\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403367 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/163e41b7-e7e1-4f98-80df-ea25cca890e5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cqzzn\" (UID: \"163e41b7-e7e1-4f98-80df-ea25cca890e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403418 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26ba6fb-8b7a-4207-82ed-3b746c50e824-trusted-ca-bundle\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403499 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/363db0df-34ba-45e7-abce-c19cd7cc4d24-trusted-ca-bundle\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403527 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3a235c2-8a4c-41c8-b2c2-97ddad58da8b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-r5cb5\" (UID: \"f3a235c2-8a4c-41c8-b2c2-97ddad58da8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r5cb5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403556 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cd26dd1-2322-4cca-9d02-7d924ce912ee-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-splh5\" (UID: \"7cd26dd1-2322-4cca-9d02-7d924ce912ee\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-splh5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403588 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr5wg\" (UniqueName: \"kubernetes.io/projected/49d5a414-d020-4687-8d32-3141061d0c80-kube-api-access-xr5wg\") pod \"package-server-manager-789f6589d5-9kg44\" (UID: \"49d5a414-d020-4687-8d32-3141061d0c80\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403598 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403619 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/c3917a9c-dd78-498f-96b3-36fd8d6421c6-auth-proxy-config\") pod \"machine-approver-56656f9798-4rq95\" (UID: \"c3917a9c-dd78-498f-96b3-36fd8d6421c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403631 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-client-ca\") pod \"route-controller-manager-6576b87f9c-ckctb\" (UID: \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403645 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/363db0df-34ba-45e7-abce-c19cd7cc4d24-audit\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403667 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403699 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47db6098-5a83-4d02-bec9-886b3dd01a4f-serving-cert\") pod \"controller-manager-879f6c89f-6rt8z\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403718 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac71e92a-9043-4004-b570-a0ce86bf2e76-serving-cert\") pod \"service-ca-operator-777779d784-m2xbf\" (UID: \"ac71e92a-9043-4004-b570-a0ce86bf2e76\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403748 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403771 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/363db0df-34ba-45e7-abce-c19cd7cc4d24-etcd-client\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403795 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a017a02-2d7a-442d-befc-943f6dc038cd-proxy-tls\") pod \"machine-config-controller-84d6567774-gdz6n\" (UID: \"0a017a02-2d7a-442d-befc-943f6dc038cd\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403817 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c26ba6fb-8b7a-4207-82ed-3b746c50e824-console-oauth-config\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403839 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/64b9cc65-319e-48c9-9772-0abae151c1ba-etcd-client\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.404051 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3566a8b9-503a-4c4b-a008-8bfeb5e38fa8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ktf4x\" (UID: \"3566a8b9-503a-4c4b-a008-8bfeb5e38fa8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktf4x" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.403553 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6rt8z\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.406905 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.408754 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e3bd3021-b5e7-4c2c-8152-6f0450cea681-audit-dir\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.408914 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5ecda186-bc95-4f85-89cb-1c3fcc1354ce-available-featuregates\") pod \"openshift-config-operator-7777fb866f-4dfqb\" (UID: \"5ecda186-bc95-4f85-89cb-1c3fcc1354ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.409880 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.410210 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c3917a9c-dd78-498f-96b3-36fd8d6421c6-auth-proxy-config\") pod \"machine-approver-56656f9798-4rq95\" (UID: \"c3917a9c-dd78-498f-96b3-36fd8d6421c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.410901 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/363db0df-34ba-45e7-abce-c19cd7cc4d24-node-pullsecrets\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.411170 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6p99f"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.411552 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-config\") pod \"route-controller-manager-6576b87f9c-ckctb\" (UID: \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.411922 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3566a8b9-503a-4c4b-a008-8bfeb5e38fa8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ktf4x\" (UID: \"3566a8b9-503a-4c4b-a008-8bfeb5e38fa8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktf4x" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.411983 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64b9cc65-319e-48c9-9772-0abae151c1ba-audit-dir\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.402697 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64b9cc65-319e-48c9-9772-0abae151c1ba-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.414932 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c26ba6fb-8b7a-4207-82ed-3b746c50e824-oauth-serving-cert\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.417042 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6p99f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.417316 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e2a0e30-6d84-4db6-bb01-3012041a2b84-service-ca-bundle\") pod \"authentication-operator-69f744f599-btm7s\" (UID: \"0e2a0e30-6d84-4db6-bb01-3012041a2b84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.417584 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9d4hw" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.419343 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e2a0e30-6d84-4db6-bb01-3012041a2b84-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-btm7s\" (UID: \"0e2a0e30-6d84-4db6-bb01-3012041a2b84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.419533 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.420188 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f828bb9-12a2-4148-999b-ab78f638d4b0-trusted-ca\") pod \"console-operator-58897d9998-64gjv\" (UID: \"3f828bb9-12a2-4148-999b-ab78f638d4b0\") " pod="openshift-console-operator/console-operator-58897d9998-64gjv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.420382 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e2a0e30-6d84-4db6-bb01-3012041a2b84-serving-cert\") pod \"authentication-operator-69f744f599-btm7s\" (UID: \"0e2a0e30-6d84-4db6-bb01-3012041a2b84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.421903 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/363db0df-34ba-45e7-abce-c19cd7cc4d24-trusted-ca-bundle\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.423555 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.424512 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c26ba6fb-8b7a-4207-82ed-3b746c50e824-console-serving-cert\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.427209 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c26ba6fb-8b7a-4207-82ed-3b746c50e824-console-config\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 
09:04:04.427297 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c26ba6fb-8b7a-4207-82ed-3b746c50e824-service-ca\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.427583 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gfd8p"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.427759 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.428176 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/64b9cc65-319e-48c9-9772-0abae151c1ba-etcd-client\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.428297 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d28f6be-a6f5-4605-b071-ec453a08a7d7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mtcwj\" (UID: \"5d28f6be-a6f5-4605-b071-ec453a08a7d7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.428449 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b49c3e1-6d6d-498b-81a7-f40174f7f7c1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vwwtm\" (UID: \"4b49c3e1-6d6d-498b-81a7-f40174f7f7c1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.428647 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c3917a9c-dd78-498f-96b3-36fd8d6421c6-machine-approver-tls\") pod \"machine-approver-56656f9798-4rq95\" (UID: \"c3917a9c-dd78-498f-96b3-36fd8d6421c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.429012 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64b9cc65-319e-48c9-9772-0abae151c1ba-serving-cert\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.429448 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/64b9cc65-319e-48c9-9772-0abae151c1ba-encryption-config\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.429658 4681 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/363db0df-34ba-45e7-abce-c19cd7cc4d24-etcd-client\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.430157 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ecda186-bc95-4f85-89cb-1c3fcc1354ce-serving-cert\") pod \"openshift-config-operator-7777fb866f-4dfqb\" (UID: \"5ecda186-bc95-4f85-89cb-1c3fcc1354ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.430514 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.432906 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.439784 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b49c3e1-6d6d-498b-81a7-f40174f7f7c1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vwwtm\" (UID: \"4b49c3e1-6d6d-498b-81a7-f40174f7f7c1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.440855 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.441146 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47db6098-5a83-4d02-bec9-886b3dd01a4f-serving-cert\") pod \"controller-manager-879f6c89f-6rt8z\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.441395 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c26ba6fb-8b7a-4207-82ed-3b746c50e824-console-oauth-config\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.441701 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.442465 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/363db0df-34ba-45e7-abce-c19cd7cc4d24-encryption-config\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.443230 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26ba6fb-8b7a-4207-82ed-3b746c50e824-trusted-ca-bundle\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.448405 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.448446 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgbd"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.449531 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.451556 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.452182 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f828bb9-12a2-4148-999b-ab78f638d4b0-serving-cert\") pod \"console-operator-58897d9998-64gjv\" (UID: \"3f828bb9-12a2-4148-999b-ab78f638d4b0\") " pod="openshift-console-operator/console-operator-58897d9998-64gjv" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.452649 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e3ad21c-5666-479f-9ee9-0ccdf0f2c0fa-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-klxjr\" (UID: \"2e3ad21c-5666-479f-9ee9-0ccdf0f2c0fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-klxjr" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.452768 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-btm7s"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.452912 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.452950 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.453185 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.453328 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.453481 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.454375 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-c2p8w"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.455303 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fmngz"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.457271 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.458498 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-fvrbp"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.458803 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.459585 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pnhr5"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.460587 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-64gjv"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.463680 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9d4hw"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.463708 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-96pmp"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.464610 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pfbg7"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.464714 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.467030 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.467130 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.468557 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.469548 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-splh5"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.471139 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-c287z"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.473281 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5rpfx"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.474324 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.475401 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qhf7x"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.476381 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r5cb5"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.477465 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-s5tkc"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.477806 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.478683 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zknvr"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.479719 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.480713 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.481930 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.483196 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qjngv"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.484158 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-96pmp"] Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.485138 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6p99f"] Jan 
22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.498279 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.504598 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3a235c2-8a4c-41c8-b2c2-97ddad58da8b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-r5cb5\" (UID: \"f3a235c2-8a4c-41c8-b2c2-97ddad58da8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r5cb5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.504645 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cd26dd1-2322-4cca-9d02-7d924ce912ee-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-splh5\" (UID: \"7cd26dd1-2322-4cca-9d02-7d924ce912ee\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-splh5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.504679 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr5wg\" (UniqueName: \"kubernetes.io/projected/49d5a414-d020-4687-8d32-3141061d0c80-kube-api-access-xr5wg\") pod \"package-server-manager-789f6589d5-9kg44\" (UID: \"49d5a414-d020-4687-8d32-3141061d0c80\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.504833 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac71e92a-9043-4004-b570-a0ce86bf2e76-serving-cert\") pod \"service-ca-operator-777779d784-m2xbf\" (UID: \"ac71e92a-9043-4004-b570-a0ce86bf2e76\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.504867 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a017a02-2d7a-442d-befc-943f6dc038cd-proxy-tls\") pod \"machine-config-controller-84d6567774-gdz6n\" (UID: \"0a017a02-2d7a-442d-befc-943f6dc038cd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.504909 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r9hj\" (UniqueName: \"kubernetes.io/projected/0a017a02-2d7a-442d-befc-943f6dc038cd-kube-api-access-8r9hj\") pod \"machine-config-controller-84d6567774-gdz6n\" (UID: \"0a017a02-2d7a-442d-befc-943f6dc038cd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.504944 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b987p\" (UniqueName: \"kubernetes.io/projected/163e41b7-e7e1-4f98-80df-ea25cca890e5-kube-api-access-b987p\") pod \"ingress-operator-5b745b69d9-cqzzn\" (UID: \"163e41b7-e7e1-4f98-80df-ea25cca890e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.505011 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f3a235c2-8a4c-41c8-b2c2-97ddad58da8b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-r5cb5\" (UID: \"f3a235c2-8a4c-41c8-b2c2-97ddad58da8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r5cb5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.505059 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/49d5a414-d020-4687-8d32-3141061d0c80-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-9kg44\" (UID: \"49d5a414-d020-4687-8d32-3141061d0c80\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.505106 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1394b89-92be-4238-8c46-0be31f9ba572-config\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.505815 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1394b89-92be-4238-8c46-0be31f9ba572-config\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506320 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163e41b7-e7e1-4f98-80df-ea25cca890e5-metrics-tls\") pod \"ingress-operator-5b745b69d9-cqzzn\" (UID: \"163e41b7-e7e1-4f98-80df-ea25cca890e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506367 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/293c5501-59eb-41ae-b7c4-19b7cffb2b6b-node-bootstrap-token\") pod \"machine-config-server-fqc8k\" (UID: \"293c5501-59eb-41ae-b7c4-19b7cffb2b6b\") " pod="openshift-machine-config-operator/machine-config-server-fqc8k" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506389 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63433617-6d4f-45f6-9b31-51313dbb4985-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2hgbd\" (UID: \"63433617-6d4f-45f6-9b31-51313dbb4985\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgbd" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506412 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whjzk\" (UniqueName: \"kubernetes.io/projected/6d75b145-9547-49b4-9aea-652ea33cb371-kube-api-access-whjzk\") pod \"marketplace-operator-79b997595-qhf7x\" (UID: \"6d75b145-9547-49b4-9aea-652ea33cb371\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506454 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a6477b5b-15c2-458c-ba42-4be5ca90acb6-metrics-tls\") pod \"dns-operator-744455d44c-s5tkc\" 
(UID: \"a6477b5b-15c2-458c-ba42-4be5ca90acb6\") " pod="openshift-dns-operator/dns-operator-744455d44c-s5tkc" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506487 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47f80193-b6ae-4185-aa1f-320ba7f8dce9-metrics-certs\") pod \"router-default-5444994796-wcld5\" (UID: \"47f80193-b6ae-4185-aa1f-320ba7f8dce9\") " pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506504 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/793ce360-5662-4226-9eb1-d11580d41655-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pnhr5\" (UID: \"793ce360-5662-4226-9eb1-d11580d41655\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pnhr5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506529 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmbd6\" (UniqueName: \"kubernetes.io/projected/a2419737-5527-429d-a3b9-213a60b502cb-kube-api-access-zmbd6\") pod \"packageserver-d55dfcdfc-m5r66\" (UID: \"a2419737-5527-429d-a3b9-213a60b502cb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506568 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq9mm\" (UniqueName: \"kubernetes.io/projected/74aff32c-9835-440b-9961-5fbcada6c96b-kube-api-access-nq9mm\") pod \"control-plane-machine-set-operator-78cbb6b69f-gfd8p\" (UID: \"74aff32c-9835-440b-9961-5fbcada6c96b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gfd8p" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506584 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a1394b89-92be-4238-8c46-0be31f9ba572-etcd-service-ca\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506612 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/293c5501-59eb-41ae-b7c4-19b7cffb2b6b-certs\") pod \"machine-config-server-fqc8k\" (UID: \"293c5501-59eb-41ae-b7c4-19b7cffb2b6b\") " pod="openshift-machine-config-operator/machine-config-server-fqc8k" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506628 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cd26dd1-2322-4cca-9d02-7d924ce912ee-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-splh5\" (UID: \"7cd26dd1-2322-4cca-9d02-7d924ce912ee\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-splh5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506647 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b3515bca-636e-4f35-b3d5-3897757ec083-srv-cert\") pod \"catalog-operator-68c6474976-qxrpc\" (UID: \"b3515bca-636e-4f35-b3d5-3897757ec083\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 
09:04:04.506674 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f43902ae-4bee-4612-8e55-ca6ffc779ec0-proxy-tls\") pod \"machine-config-operator-74547568cd-hhrz2\" (UID: \"f43902ae-4bee-4612-8e55-ca6ffc779ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506702 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv4cf\" (UniqueName: \"kubernetes.io/projected/2d3823b7-bc6c-4206-9a4a-347488ed67ba-kube-api-access-sv4cf\") pod \"migrator-59844c95c7-fmngz\" (UID: \"2d3823b7-bc6c-4206-9a4a-347488ed67ba\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fmngz" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506721 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zkww\" (UniqueName: \"kubernetes.io/projected/a1394b89-92be-4238-8c46-0be31f9ba572-kube-api-access-6zkww\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506737 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac71e92a-9043-4004-b570-a0ce86bf2e76-config\") pod \"service-ca-operator-777779d784-m2xbf\" (UID: \"ac71e92a-9043-4004-b570-a0ce86bf2e76\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506756 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a2419737-5527-429d-a3b9-213a60b502cb-webhook-cert\") pod \"packageserver-d55dfcdfc-m5r66\" (UID: \"a2419737-5527-429d-a3b9-213a60b502cb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506773 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/793ce360-5662-4226-9eb1-d11580d41655-config\") pod \"kube-apiserver-operator-766d6c64bb-pnhr5\" (UID: \"793ce360-5662-4226-9eb1-d11580d41655\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pnhr5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506802 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/74aff32c-9835-440b-9961-5fbcada6c96b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gfd8p\" (UID: \"74aff32c-9835-440b-9961-5fbcada6c96b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gfd8p" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506829 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13879fd8-0486-46df-8a5d-ef6c81d712fa-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zknvr\" (UID: \"13879fd8-0486-46df-8a5d-ef6c81d712fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zknvr" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506859 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-49zkc\" (UniqueName: \"kubernetes.io/projected/b3515bca-636e-4f35-b3d5-3897757ec083-kube-api-access-49zkc\") pod \"catalog-operator-68c6474976-qxrpc\" (UID: \"b3515bca-636e-4f35-b3d5-3897757ec083\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506880 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f43902ae-4bee-4612-8e55-ca6ffc779ec0-images\") pod \"machine-config-operator-74547568cd-hhrz2\" (UID: \"f43902ae-4bee-4612-8e55-ca6ffc779ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506904 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s4ch\" (UniqueName: \"kubernetes.io/projected/47f80193-b6ae-4185-aa1f-320ba7f8dce9-kube-api-access-5s4ch\") pod \"router-default-5444994796-wcld5\" (UID: \"47f80193-b6ae-4185-aa1f-320ba7f8dce9\") " pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506942 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f43902ae-4bee-4612-8e55-ca6ffc779ec0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hhrz2\" (UID: \"f43902ae-4bee-4612-8e55-ca6ffc779ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506958 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5334d917-0f71-4e93-ae7a-2a169f3b7a34-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-4xk8b\" (UID: \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\") " pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.506976 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a2419737-5527-429d-a3b9-213a60b502cb-tmpfs\") pod \"packageserver-d55dfcdfc-m5r66\" (UID: \"a2419737-5527-429d-a3b9-213a60b502cb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507009 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv4wh\" (UniqueName: \"kubernetes.io/projected/293c5501-59eb-41ae-b7c4-19b7cffb2b6b-kube-api-access-kv4wh\") pod \"machine-config-server-fqc8k\" (UID: \"293c5501-59eb-41ae-b7c4-19b7cffb2b6b\") " pod="openshift-machine-config-operator/machine-config-server-fqc8k" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507032 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rrth\" (UniqueName: \"kubernetes.io/projected/13879fd8-0486-46df-8a5d-ef6c81d712fa-kube-api-access-9rrth\") pod \"openshift-controller-manager-operator-756b6f6bc6-zknvr\" (UID: \"13879fd8-0486-46df-8a5d-ef6c81d712fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zknvr" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507062 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h9bw\" (UniqueName: 
\"kubernetes.io/projected/559df17a-f729-4647-9893-64aa96331ed6-kube-api-access-9h9bw\") pod \"ingress-canary-5rpfx\" (UID: \"559df17a-f729-4647-9893-64aa96331ed6\") " pod="openshift-ingress-canary/ingress-canary-5rpfx" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507088 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/163e41b7-e7e1-4f98-80df-ea25cca890e5-trusted-ca\") pod \"ingress-operator-5b745b69d9-cqzzn\" (UID: \"163e41b7-e7e1-4f98-80df-ea25cca890e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507112 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1394b89-92be-4238-8c46-0be31f9ba572-serving-cert\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507147 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a1394b89-92be-4238-8c46-0be31f9ba572-etcd-ca\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507155 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a1394b89-92be-4238-8c46-0be31f9ba572-etcd-service-ca\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507165 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d75b145-9547-49b4-9aea-652ea33cb371-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qhf7x\" (UID: \"6d75b145-9547-49b4-9aea-652ea33cb371\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507186 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/5334d917-0f71-4e93-ae7a-2a169f3b7a34-ready\") pod \"cni-sysctl-allowlist-ds-4xk8b\" (UID: \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\") " pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507208 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7n82\" (UniqueName: \"kubernetes.io/projected/f3a235c2-8a4c-41c8-b2c2-97ddad58da8b-kube-api-access-g7n82\") pod \"kube-storage-version-migrator-operator-b67b599dd-r5cb5\" (UID: \"f3a235c2-8a4c-41c8-b2c2-97ddad58da8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r5cb5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507233 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd26dd1-2322-4cca-9d02-7d924ce912ee-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-splh5\" (UID: \"7cd26dd1-2322-4cca-9d02-7d924ce912ee\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-splh5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507254 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13879fd8-0486-46df-8a5d-ef6c81d712fa-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zknvr\" (UID: \"13879fd8-0486-46df-8a5d-ef6c81d712fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zknvr" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507285 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63433617-6d4f-45f6-9b31-51313dbb4985-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2hgbd\" (UID: \"63433617-6d4f-45f6-9b31-51313dbb4985\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgbd" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507279 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5334d917-0f71-4e93-ae7a-2a169f3b7a34-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-4xk8b\" (UID: \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\") " pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507309 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfvn6\" (UniqueName: \"kubernetes.io/projected/5334d917-0f71-4e93-ae7a-2a169f3b7a34-kube-api-access-jfvn6\") pod \"cni-sysctl-allowlist-ds-4xk8b\" (UID: \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\") " pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507339 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/559df17a-f729-4647-9893-64aa96331ed6-cert\") pod \"ingress-canary-5rpfx\" (UID: \"559df17a-f729-4647-9893-64aa96331ed6\") " pod="openshift-ingress-canary/ingress-canary-5rpfx" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507360 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2419737-5527-429d-a3b9-213a60b502cb-apiservice-cert\") pod \"packageserver-d55dfcdfc-m5r66\" (UID: \"a2419737-5527-429d-a3b9-213a60b502cb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507377 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b3515bca-636e-4f35-b3d5-3897757ec083-profile-collector-cert\") pod \"catalog-operator-68c6474976-qxrpc\" (UID: \"b3515bca-636e-4f35-b3d5-3897757ec083\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507403 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/47f80193-b6ae-4185-aa1f-320ba7f8dce9-stats-auth\") pod \"router-default-5444994796-wcld5\" (UID: \"47f80193-b6ae-4185-aa1f-320ba7f8dce9\") " pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507424 4681 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zkkgg\" (UniqueName: \"kubernetes.io/projected/f43902ae-4bee-4612-8e55-ca6ffc779ec0-kube-api-access-zkkgg\") pod \"machine-config-operator-74547568cd-hhrz2\" (UID: \"f43902ae-4bee-4612-8e55-ca6ffc779ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507442 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d75b145-9547-49b4-9aea-652ea33cb371-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qhf7x\" (UID: \"6d75b145-9547-49b4-9aea-652ea33cb371\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507464 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a1394b89-92be-4238-8c46-0be31f9ba572-etcd-client\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507482 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5334d917-0f71-4e93-ae7a-2a169f3b7a34-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-4xk8b\" (UID: \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\") " pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507499 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/47f80193-b6ae-4185-aa1f-320ba7f8dce9-default-certificate\") pod \"router-default-5444994796-wcld5\" (UID: \"47f80193-b6ae-4185-aa1f-320ba7f8dce9\") " pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507516 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw7sx\" (UniqueName: \"kubernetes.io/projected/ac71e92a-9043-4004-b570-a0ce86bf2e76-kube-api-access-zw7sx\") pod \"service-ca-operator-777779d784-m2xbf\" (UID: \"ac71e92a-9043-4004-b570-a0ce86bf2e76\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507539 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njqq6\" (UniqueName: \"kubernetes.io/projected/a6477b5b-15c2-458c-ba42-4be5ca90acb6-kube-api-access-njqq6\") pod \"dns-operator-744455d44c-s5tkc\" (UID: \"a6477b5b-15c2-458c-ba42-4be5ca90acb6\") " pod="openshift-dns-operator/dns-operator-744455d44c-s5tkc" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507585 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47f80193-b6ae-4185-aa1f-320ba7f8dce9-service-ca-bundle\") pod \"router-default-5444994796-wcld5\" (UID: \"47f80193-b6ae-4185-aa1f-320ba7f8dce9\") " pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507608 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63433617-6d4f-45f6-9b31-51313dbb4985-config\") pod \"kube-controller-manager-operator-78b949d7b-2hgbd\" (UID: 
\"63433617-6d4f-45f6-9b31-51313dbb4985\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgbd" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507634 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/793ce360-5662-4226-9eb1-d11580d41655-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pnhr5\" (UID: \"793ce360-5662-4226-9eb1-d11580d41655\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pnhr5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507655 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a017a02-2d7a-442d-befc-943f6dc038cd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gdz6n\" (UID: \"0a017a02-2d7a-442d-befc-943f6dc038cd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.507676 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/163e41b7-e7e1-4f98-80df-ea25cca890e5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cqzzn\" (UID: \"163e41b7-e7e1-4f98-80df-ea25cca890e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.508135 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a2419737-5527-429d-a3b9-213a60b502cb-tmpfs\") pod \"packageserver-d55dfcdfc-m5r66\" (UID: \"a2419737-5527-429d-a3b9-213a60b502cb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.508184 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/5334d917-0f71-4e93-ae7a-2a169f3b7a34-ready\") pod \"cni-sysctl-allowlist-ds-4xk8b\" (UID: \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\") " pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.508698 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f43902ae-4bee-4612-8e55-ca6ffc779ec0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hhrz2\" (UID: \"f43902ae-4bee-4612-8e55-ca6ffc779ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.508725 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/163e41b7-e7e1-4f98-80df-ea25cca890e5-trusted-ca\") pod \"ingress-operator-5b745b69d9-cqzzn\" (UID: \"163e41b7-e7e1-4f98-80df-ea25cca890e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.509165 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a1394b89-92be-4238-8c46-0be31f9ba572-etcd-ca\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.509528 4681 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0a017a02-2d7a-442d-befc-943f6dc038cd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gdz6n\" (UID: \"0a017a02-2d7a-442d-befc-943f6dc038cd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.509738 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163e41b7-e7e1-4f98-80df-ea25cca890e5-metrics-tls\") pod \"ingress-operator-5b745b69d9-cqzzn\" (UID: \"163e41b7-e7e1-4f98-80df-ea25cca890e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.510747 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a1394b89-92be-4238-8c46-0be31f9ba572-etcd-client\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.511844 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1394b89-92be-4238-8c46-0be31f9ba572-serving-cert\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.519006 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.531472 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13879fd8-0486-46df-8a5d-ef6c81d712fa-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zknvr\" (UID: \"13879fd8-0486-46df-8a5d-ef6c81d712fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zknvr" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.539297 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.547671 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13879fd8-0486-46df-8a5d-ef6c81d712fa-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zknvr\" (UID: \"13879fd8-0486-46df-8a5d-ef6c81d712fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zknvr" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.558848 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.578633 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.599058 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.619729 4681 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.629693 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a017a02-2d7a-442d-befc-943f6dc038cd-proxy-tls\") pod \"machine-config-controller-84d6567774-gdz6n\" (UID: \"0a017a02-2d7a-442d-befc-943f6dc038cd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.638855 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.668095 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.674011 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/47f80193-b6ae-4185-aa1f-320ba7f8dce9-default-certificate\") pod \"router-default-5444994796-wcld5\" (UID: \"47f80193-b6ae-4185-aa1f-320ba7f8dce9\") " pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.677873 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.682879 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/47f80193-b6ae-4185-aa1f-320ba7f8dce9-stats-auth\") pod \"router-default-5444994796-wcld5\" (UID: \"47f80193-b6ae-4185-aa1f-320ba7f8dce9\") " pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.698292 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.709726 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/47f80193-b6ae-4185-aa1f-320ba7f8dce9-metrics-certs\") pod \"router-default-5444994796-wcld5\" (UID: \"47f80193-b6ae-4185-aa1f-320ba7f8dce9\") " pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.718490 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.719388 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47f80193-b6ae-4185-aa1f-320ba7f8dce9-service-ca-bundle\") pod \"router-default-5444994796-wcld5\" (UID: \"47f80193-b6ae-4185-aa1f-320ba7f8dce9\") " pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.738387 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.758393 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.778707 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.798636 4681 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.818537 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.839113 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.858457 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.869238 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cd26dd1-2322-4cca-9d02-7d924ce912ee-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-splh5\" (UID: \"7cd26dd1-2322-4cca-9d02-7d924ce912ee\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-splh5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.878932 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.888501 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cd26dd1-2322-4cca-9d02-7d924ce912ee-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-splh5\" (UID: \"7cd26dd1-2322-4cca-9d02-7d924ce912ee\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-splh5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.898235 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.919461 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.939112 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.950245 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/793ce360-5662-4226-9eb1-d11580d41655-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pnhr5\" (UID: \"793ce360-5662-4226-9eb1-d11580d41655\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pnhr5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.959042 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.967575 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/793ce360-5662-4226-9eb1-d11580d41655-config\") pod \"kube-apiserver-operator-766d6c64bb-pnhr5\" (UID: \"793ce360-5662-4226-9eb1-d11580d41655\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pnhr5" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.979433 4681 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.991667 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63433617-6d4f-45f6-9b31-51313dbb4985-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2hgbd\" (UID: \"63433617-6d4f-45f6-9b31-51313dbb4985\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgbd" Jan 22 09:04:04 crc kubenswrapper[4681]: I0122 09:04:04.999224 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.019040 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.030036 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63433617-6d4f-45f6-9b31-51313dbb4985-config\") pod \"kube-controller-manager-operator-78b949d7b-2hgbd\" (UID: \"63433617-6d4f-45f6-9b31-51313dbb4985\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgbd" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.039320 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.079715 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.099594 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.118436 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.130176 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3a235c2-8a4c-41c8-b2c2-97ddad58da8b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-r5cb5\" (UID: \"f3a235c2-8a4c-41c8-b2c2-97ddad58da8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r5cb5" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.138620 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.146566 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a235c2-8a4c-41c8-b2c2-97ddad58da8b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-r5cb5\" (UID: \"f3a235c2-8a4c-41c8-b2c2-97ddad58da8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r5cb5" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.159478 4681 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.178625 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.199728 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.210739 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f43902ae-4bee-4612-8e55-ca6ffc779ec0-proxy-tls\") pod \"machine-config-operator-74547568cd-hhrz2\" (UID: \"f43902ae-4bee-4612-8e55-ca6ffc779ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.218516 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.238966 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.258648 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.268844 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f43902ae-4bee-4612-8e55-ca6ffc779ec0-images\") pod \"machine-config-operator-74547568cd-hhrz2\" (UID: \"f43902ae-4bee-4612-8e55-ca6ffc779ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.279091 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.298463 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.304796 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/74aff32c-9835-440b-9961-5fbcada6c96b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gfd8p\" (UID: \"74aff32c-9835-440b-9961-5fbcada6c96b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gfd8p" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.320077 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.336975 4681 request.go:700] Waited for 1.003559013s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/configmaps?fieldSelector=metadata.name%3Dmarketplace-trusted-ca&limit=500&resourceVersion=0 Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.348150 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 22 09:04:05 crc 
kubenswrapper[4681]: I0122 09:04:05.351212 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d75b145-9547-49b4-9aea-652ea33cb371-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qhf7x\" (UID: \"6d75b145-9547-49b4-9aea-652ea33cb371\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.359862 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.380062 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.393530 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d75b145-9547-49b4-9aea-652ea33cb371-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qhf7x\" (UID: \"6d75b145-9547-49b4-9aea-652ea33cb371\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.399302 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.420592 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.439786 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.465813 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.479761 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.491611 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b3515bca-636e-4f35-b3d5-3897757ec083-srv-cert\") pod \"catalog-operator-68c6474976-qxrpc\" (UID: \"b3515bca-636e-4f35-b3d5-3897757ec083\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.500906 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.505356 4681 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.505449 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49d5a414-d020-4687-8d32-3141061d0c80-package-server-manager-serving-cert podName:49d5a414-d020-4687-8d32-3141061d0c80 nodeName:}" failed. No retries permitted until 2026-01-22 09:04:06.005423447 +0000 UTC m=+36.831333962 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/49d5a414-d020-4687-8d32-3141061d0c80-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-9kg44" (UID: "49d5a414-d020-4687-8d32-3141061d0c80") : failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.505529 4681 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.505611 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac71e92a-9043-4004-b570-a0ce86bf2e76-serving-cert podName:ac71e92a-9043-4004-b570-a0ce86bf2e76 nodeName:}" failed. No retries permitted until 2026-01-22 09:04:06.005590361 +0000 UTC m=+36.831500866 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ac71e92a-9043-4004-b570-a0ce86bf2e76-serving-cert") pod "service-ca-operator-777779d784-m2xbf" (UID: "ac71e92a-9043-4004-b570-a0ce86bf2e76") : failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.506686 4681 secret.go:188] Couldn't get secret openshift-dns-operator/metrics-tls: failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.506730 4681 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.506773 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6477b5b-15c2-458c-ba42-4be5ca90acb6-metrics-tls podName:a6477b5b-15c2-458c-ba42-4be5ca90acb6 nodeName:}" failed. No retries permitted until 2026-01-22 09:04:06.006757392 +0000 UTC m=+36.832667997 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a6477b5b-15c2-458c-ba42-4be5ca90acb6-metrics-tls") pod "dns-operator-744455d44c-s5tkc" (UID: "a6477b5b-15c2-458c-ba42-4be5ca90acb6") : failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.506785 4681 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.506800 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/293c5501-59eb-41ae-b7c4-19b7cffb2b6b-certs podName:293c5501-59eb-41ae-b7c4-19b7cffb2b6b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:06.006788923 +0000 UTC m=+36.832699568 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/293c5501-59eb-41ae-b7c4-19b7cffb2b6b-certs") pod "machine-config-server-fqc8k" (UID: "293c5501-59eb-41ae-b7c4-19b7cffb2b6b") : failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.506836 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/293c5501-59eb-41ae-b7c4-19b7cffb2b6b-node-bootstrap-token podName:293c5501-59eb-41ae-b7c4-19b7cffb2b6b nodeName:}" failed. 
No retries permitted until 2026-01-22 09:04:06.006822874 +0000 UTC m=+36.832733389 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/293c5501-59eb-41ae-b7c4-19b7cffb2b6b-node-bootstrap-token") pod "machine-config-server-fqc8k" (UID: "293c5501-59eb-41ae-b7c4-19b7cffb2b6b") : failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.508035 4681 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.508078 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ac71e92a-9043-4004-b570-a0ce86bf2e76-config podName:ac71e92a-9043-4004-b570-a0ce86bf2e76 nodeName:}" failed. No retries permitted until 2026-01-22 09:04:06.008069837 +0000 UTC m=+36.833980342 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ac71e92a-9043-4004-b570-a0ce86bf2e76-config") pod "service-ca-operator-777779d784-m2xbf" (UID: "ac71e92a-9043-4004-b570-a0ce86bf2e76") : failed to sync configmap cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.508038 4681 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.508112 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2419737-5527-429d-a3b9-213a60b502cb-webhook-cert podName:a2419737-5527-429d-a3b9-213a60b502cb nodeName:}" failed. No retries permitted until 2026-01-22 09:04:06.008105898 +0000 UTC m=+36.834016403 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/a2419737-5527-429d-a3b9-213a60b502cb-webhook-cert") pod "packageserver-d55dfcdfc-m5r66" (UID: "a2419737-5527-429d-a3b9-213a60b502cb") : failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.509181 4681 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.509227 4681 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.509242 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/559df17a-f729-4647-9893-64aa96331ed6-cert podName:559df17a-f729-4647-9893-64aa96331ed6 nodeName:}" failed. No retries permitted until 2026-01-22 09:04:06.009226537 +0000 UTC m=+36.835137142 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/559df17a-f729-4647-9893-64aa96331ed6-cert") pod "ingress-canary-5rpfx" (UID: "559df17a-f729-4647-9893-64aa96331ed6") : failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.509291 4681 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.509340 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3515bca-636e-4f35-b3d5-3897757ec083-profile-collector-cert podName:b3515bca-636e-4f35-b3d5-3897757ec083 nodeName:}" failed. No retries permitted until 2026-01-22 09:04:06.009291109 +0000 UTC m=+36.835201724 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/b3515bca-636e-4f35-b3d5-3897757ec083-profile-collector-cert") pod "catalog-operator-68c6474976-qxrpc" (UID: "b3515bca-636e-4f35-b3d5-3897757ec083") : failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.509367 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2419737-5527-429d-a3b9-213a60b502cb-apiservice-cert podName:a2419737-5527-429d-a3b9-213a60b502cb nodeName:}" failed. No retries permitted until 2026-01-22 09:04:06.009354011 +0000 UTC m=+36.835264536 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/a2419737-5527-429d-a3b9-213a60b502cb-apiservice-cert") pod "packageserver-d55dfcdfc-m5r66" (UID: "a2419737-5527-429d-a3b9-213a60b502cb") : failed to sync secret cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.509368 4681 configmap.go:193] Couldn't get configMap openshift-multus/cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: E0122 09:04:05.509426 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5334d917-0f71-4e93-ae7a-2a169f3b7a34-cni-sysctl-allowlist podName:5334d917-0f71-4e93-ae7a-2a169f3b7a34 nodeName:}" failed. No retries permitted until 2026-01-22 09:04:06.009414052 +0000 UTC m=+36.835324567 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/5334d917-0f71-4e93-ae7a-2a169f3b7a34-cni-sysctl-allowlist") pod "cni-sysctl-allowlist-ds-4xk8b" (UID: "5334d917-0f71-4e93-ae7a-2a169f3b7a34") : failed to sync configmap cache: timed out waiting for the condition Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.518953 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.538658 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.558686 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.578704 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.598725 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.618840 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.640103 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.659195 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.680229 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.699914 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.720538 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.739173 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.759469 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.779426 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.799416 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.818480 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.839634 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 22 09:04:05 crc 
kubenswrapper[4681]: I0122 09:04:05.859244 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.878724 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.899548 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.919393 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.939493 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.958736 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.979190 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 22 09:04:05 crc kubenswrapper[4681]: I0122 09:04:05.999144 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.019360 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.037816 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac71e92a-9043-4004-b570-a0ce86bf2e76-config\") pod \"service-ca-operator-777779d784-m2xbf\" (UID: \"ac71e92a-9043-4004-b570-a0ce86bf2e76\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.037891 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a2419737-5527-429d-a3b9-213a60b502cb-webhook-cert\") pod \"packageserver-d55dfcdfc-m5r66\" (UID: \"a2419737-5527-429d-a3b9-213a60b502cb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.038156 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/559df17a-f729-4647-9893-64aa96331ed6-cert\") pod \"ingress-canary-5rpfx\" (UID: \"559df17a-f729-4647-9893-64aa96331ed6\") " pod="openshift-ingress-canary/ingress-canary-5rpfx" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.038226 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2419737-5527-429d-a3b9-213a60b502cb-apiservice-cert\") pod \"packageserver-d55dfcdfc-m5r66\" (UID: \"a2419737-5527-429d-a3b9-213a60b502cb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.038293 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b3515bca-636e-4f35-b3d5-3897757ec083-profile-collector-cert\") pod \"catalog-operator-68c6474976-qxrpc\" (UID: 
\"b3515bca-636e-4f35-b3d5-3897757ec083\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.038364 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5334d917-0f71-4e93-ae7a-2a169f3b7a34-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-4xk8b\" (UID: \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\") " pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.038491 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac71e92a-9043-4004-b570-a0ce86bf2e76-serving-cert\") pod \"service-ca-operator-777779d784-m2xbf\" (UID: \"ac71e92a-9043-4004-b570-a0ce86bf2e76\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.038562 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/49d5a414-d020-4687-8d32-3141061d0c80-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-9kg44\" (UID: \"49d5a414-d020-4687-8d32-3141061d0c80\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.038632 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/293c5501-59eb-41ae-b7c4-19b7cffb2b6b-node-bootstrap-token\") pod \"machine-config-server-fqc8k\" (UID: \"293c5501-59eb-41ae-b7c4-19b7cffb2b6b\") " pod="openshift-machine-config-operator/machine-config-server-fqc8k" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.038687 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a6477b5b-15c2-458c-ba42-4be5ca90acb6-metrics-tls\") pod \"dns-operator-744455d44c-s5tkc\" (UID: \"a6477b5b-15c2-458c-ba42-4be5ca90acb6\") " pod="openshift-dns-operator/dns-operator-744455d44c-s5tkc" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.038831 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/293c5501-59eb-41ae-b7c4-19b7cffb2b6b-certs\") pod \"machine-config-server-fqc8k\" (UID: \"293c5501-59eb-41ae-b7c4-19b7cffb2b6b\") " pod="openshift-machine-config-operator/machine-config-server-fqc8k" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.038900 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.040643 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5334d917-0f71-4e93-ae7a-2a169f3b7a34-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-4xk8b\" (UID: \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\") " pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.043011 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/49d5a414-d020-4687-8d32-3141061d0c80-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-9kg44\" (UID: \"49d5a414-d020-4687-8d32-3141061d0c80\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.043710 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a6477b5b-15c2-458c-ba42-4be5ca90acb6-metrics-tls\") pod \"dns-operator-744455d44c-s5tkc\" (UID: \"a6477b5b-15c2-458c-ba42-4be5ca90acb6\") " pod="openshift-dns-operator/dns-operator-744455d44c-s5tkc" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.044384 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/293c5501-59eb-41ae-b7c4-19b7cffb2b6b-node-bootstrap-token\") pod \"machine-config-server-fqc8k\" (UID: \"293c5501-59eb-41ae-b7c4-19b7cffb2b6b\") " pod="openshift-machine-config-operator/machine-config-server-fqc8k" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.044405 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/559df17a-f729-4647-9893-64aa96331ed6-cert\") pod \"ingress-canary-5rpfx\" (UID: \"559df17a-f729-4647-9893-64aa96331ed6\") " pod="openshift-ingress-canary/ingress-canary-5rpfx" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.044732 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b3515bca-636e-4f35-b3d5-3897757ec083-profile-collector-cert\") pod \"catalog-operator-68c6474976-qxrpc\" (UID: \"b3515bca-636e-4f35-b3d5-3897757ec083\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.045839 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/293c5501-59eb-41ae-b7c4-19b7cffb2b6b-certs\") pod \"machine-config-server-fqc8k\" (UID: \"293c5501-59eb-41ae-b7c4-19b7cffb2b6b\") " pod="openshift-machine-config-operator/machine-config-server-fqc8k" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.046842 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a2419737-5527-429d-a3b9-213a60b502cb-webhook-cert\") pod \"packageserver-d55dfcdfc-m5r66\" (UID: \"a2419737-5527-429d-a3b9-213a60b502cb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.047070 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2419737-5527-429d-a3b9-213a60b502cb-apiservice-cert\") pod \"packageserver-d55dfcdfc-m5r66\" (UID: \"a2419737-5527-429d-a3b9-213a60b502cb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.047360 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac71e92a-9043-4004-b570-a0ce86bf2e76-serving-cert\") pod \"service-ca-operator-777779d784-m2xbf\" (UID: \"ac71e92a-9043-4004-b570-a0ce86bf2e76\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.050049 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ac71e92a-9043-4004-b570-a0ce86bf2e76-config\") pod \"service-ca-operator-777779d784-m2xbf\" (UID: \"ac71e92a-9043-4004-b570-a0ce86bf2e76\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.058765 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.115661 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkrqk\" (UniqueName: \"kubernetes.io/projected/c26ba6fb-8b7a-4207-82ed-3b746c50e824-kube-api-access-wkrqk\") pod \"console-f9d7485db-glh9f\" (UID: \"c26ba6fb-8b7a-4207-82ed-3b746c50e824\") " pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.133595 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prbds\" (UniqueName: \"kubernetes.io/projected/64b9cc65-319e-48c9-9772-0abae151c1ba-kube-api-access-prbds\") pod \"apiserver-7bbb656c7d-crvnv\" (UID: \"64b9cc65-319e-48c9-9772-0abae151c1ba\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.152819 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8srj\" (UniqueName: \"kubernetes.io/projected/3566a8b9-503a-4c4b-a008-8bfeb5e38fa8-kube-api-access-t8srj\") pod \"openshift-apiserver-operator-796bbdcf4f-ktf4x\" (UID: \"3566a8b9-503a-4c4b-a008-8bfeb5e38fa8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktf4x" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.154738 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktf4x" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.165562 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.178812 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc5wx\" (UniqueName: \"kubernetes.io/projected/2e3ad21c-5666-479f-9ee9-0ccdf0f2c0fa-kube-api-access-nc5wx\") pod \"cluster-samples-operator-665b6dd947-klxjr\" (UID: \"2e3ad21c-5666-479f-9ee9-0ccdf0f2c0fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-klxjr" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.197055 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69vcp\" (UniqueName: \"kubernetes.io/projected/4b49c3e1-6d6d-498b-81a7-f40174f7f7c1-kube-api-access-69vcp\") pod \"cluster-image-registry-operator-dc59b4c8b-vwwtm\" (UID: \"4b49c3e1-6d6d-498b-81a7-f40174f7f7c1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.215394 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b49c3e1-6d6d-498b-81a7-f40174f7f7c1-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vwwtm\" (UID: \"4b49c3e1-6d6d-498b-81a7-f40174f7f7c1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.236339 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvbkv\" (UniqueName: \"kubernetes.io/projected/e3bd3021-b5e7-4c2c-8152-6f0450cea681-kube-api-access-jvbkv\") pod \"oauth-openshift-558db77b4-dmfxj\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.256384 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr4cp\" (UniqueName: \"kubernetes.io/projected/5c1adb5a-0fb0-4a16-a42e-0b79ec825963-kube-api-access-pr4cp\") pod \"downloads-7954f5f757-fvrbp\" (UID: \"5c1adb5a-0fb0-4a16-a42e-0b79ec825963\") " pod="openshift-console/downloads-7954f5f757-fvrbp" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.278503 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vv87\" (UniqueName: \"kubernetes.io/projected/5d28f6be-a6f5-4605-b071-ec453a08a7d7-kube-api-access-6vv87\") pod \"machine-api-operator-5694c8668f-mtcwj\" (UID: \"5d28f6be-a6f5-4605-b071-ec453a08a7d7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.283573 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.288780 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-fvrbp" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.290216 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.296432 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kl8k\" (UniqueName: \"kubernetes.io/projected/47db6098-5a83-4d02-bec9-886b3dd01a4f-kube-api-access-5kl8k\") pod \"controller-manager-879f6c89f-6rt8z\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.316369 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpfvm\" (UniqueName: \"kubernetes.io/projected/0e2a0e30-6d84-4db6-bb01-3012041a2b84-kube-api-access-gpfvm\") pod \"authentication-operator-69f744f599-btm7s\" (UID: \"0e2a0e30-6d84-4db6-bb01-3012041a2b84\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.322283 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.332664 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c6xv\" (UniqueName: \"kubernetes.io/projected/5ecda186-bc95-4f85-89cb-1c3fcc1354ce-kube-api-access-5c6xv\") pod \"openshift-config-operator-7777fb866f-4dfqb\" (UID: \"5ecda186-bc95-4f85-89cb-1c3fcc1354ce\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.339389 4681 request.go:700] Waited for 1.935611957s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-machine-approver/serviceaccounts/machine-approver-sa/token Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.340620 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.357430 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktf4x"] Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.360298 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl87d\" (UniqueName: \"kubernetes.io/projected/c3917a9c-dd78-498f-96b3-36fd8d6421c6-kube-api-access-pl87d\") pod \"machine-approver-56656f9798-4rq95\" (UID: \"c3917a9c-dd78-498f-96b3-36fd8d6421c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.378541 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-klxjr" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.381646 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fm7t\" (UniqueName: \"kubernetes.io/projected/3f828bb9-12a2-4148-999b-ab78f638d4b0-kube-api-access-8fm7t\") pod \"console-operator-58897d9998-64gjv\" (UID: \"3f828bb9-12a2-4148-999b-ab78f638d4b0\") " pod="openshift-console-operator/console-operator-58897d9998-64gjv" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.393524 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-glh9f"] Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.397798 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfhkd\" (UniqueName: \"kubernetes.io/projected/363db0df-34ba-45e7-abce-c19cd7cc4d24-kube-api-access-zfhkd\") pod \"apiserver-76f77b778f-c2p8w\" (UID: \"363db0df-34ba-45e7-abce-c19cd7cc4d24\") " pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:06 crc kubenswrapper[4681]: W0122 09:04:06.406086 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc26ba6fb_8b7a_4207_82ed_3b746c50e824.slice/crio-d578ec1629593b13a803b7925d0fec3176ab5bae1af87eef5044a24c299abff9 WatchSource:0}: Error finding container d578ec1629593b13a803b7925d0fec3176ab5bae1af87eef5044a24c299abff9: Status 404 returned error can't find the container with id d578ec1629593b13a803b7925d0fec3176ab5bae1af87eef5044a24c299abff9 Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.418817 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.419895 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q674l\" (UniqueName: \"kubernetes.io/projected/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-kube-api-access-q674l\") pod \"route-controller-manager-6576b87f9c-ckctb\" (UID: \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.439491 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.440731 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.445674 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.458342 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.475642 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-64gjv" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.479028 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.496537 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.498720 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.519176 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.539229 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.558928 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.561379 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv"] Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.577218 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.579380 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.591522 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-fvrbp"] Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.599131 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 22 09:04:06 crc kubenswrapper[4681]: W0122 09:04:06.609431 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c1adb5a_0fb0_4a16_a42e_0b79ec825963.slice/crio-d3874650ac0cc26a1ee5031c1e388378e96c93a01441502f834a4e20088cd90d WatchSource:0}: Error finding container d3874650ac0cc26a1ee5031c1e388378e96c93a01441502f834a4e20088cd90d: Status 404 returned error can't find the container with id d3874650ac0cc26a1ee5031c1e388378e96c93a01441502f834a4e20088cd90d Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.619112 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.638970 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.660053 4681 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.679069 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.689309 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.710162 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-klxjr"] Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.712184 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm"] Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.716532 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.724011 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr5wg\" (UniqueName: \"kubernetes.io/projected/49d5a414-d020-4687-8d32-3141061d0c80-kube-api-access-xr5wg\") pod \"package-server-manager-789f6589d5-9kg44\" (UID: \"49d5a414-d020-4687-8d32-3141061d0c80\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.734786 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r9hj\" (UniqueName: \"kubernetes.io/projected/0a017a02-2d7a-442d-befc-943f6dc038cd-kube-api-access-8r9hj\") pod \"machine-config-controller-84d6567774-gdz6n\" (UID: \"0a017a02-2d7a-442d-befc-943f6dc038cd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.760401 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b987p\" (UniqueName: \"kubernetes.io/projected/163e41b7-e7e1-4f98-80df-ea25cca890e5-kube-api-access-b987p\") pod \"ingress-operator-5b745b69d9-cqzzn\" (UID: \"163e41b7-e7e1-4f98-80df-ea25cca890e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.762698 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dmfxj"] Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.767493 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktf4x" event={"ID":"3566a8b9-503a-4c4b-a008-8bfeb5e38fa8","Type":"ContainerStarted","Data":"7badf866b1a7f71889d7bd6d1cd8bbb625c1b9deb87109613103ad0c64a5cd3f"} Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.772556 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63433617-6d4f-45f6-9b31-51313dbb4985-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2hgbd\" (UID: \"63433617-6d4f-45f6-9b31-51313dbb4985\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgbd" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.779640 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-fvrbp" event={"ID":"5c1adb5a-0fb0-4a16-a42e-0b79ec825963","Type":"ContainerStarted","Data":"d3874650ac0cc26a1ee5031c1e388378e96c93a01441502f834a4e20088cd90d"} Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.785144 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" 
event={"ID":"64b9cc65-319e-48c9-9772-0abae151c1ba","Type":"ContainerStarted","Data":"81dd53b5733a1c74dcb6238ed2cd86fc1d8ab222e04e8d69c2d015a3f477dea5"} Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.786446 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-glh9f" event={"ID":"c26ba6fb-8b7a-4207-82ed-3b746c50e824","Type":"ContainerStarted","Data":"d578ec1629593b13a803b7925d0fec3176ab5bae1af87eef5044a24c299abff9"} Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.787713 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" event={"ID":"c3917a9c-dd78-498f-96b3-36fd8d6421c6","Type":"ContainerStarted","Data":"f8196710cd2da1dbb59bd1d80aafd71042691f7a3c331d7dbe97b8df1c311a09"} Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.793205 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmbd6\" (UniqueName: \"kubernetes.io/projected/a2419737-5527-429d-a3b9-213a60b502cb-kube-api-access-zmbd6\") pod \"packageserver-d55dfcdfc-m5r66\" (UID: \"a2419737-5527-429d-a3b9-213a60b502cb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.809222 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb"] Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.811904 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq9mm\" (UniqueName: \"kubernetes.io/projected/74aff32c-9835-440b-9961-5fbcada6c96b-kube-api-access-nq9mm\") pod \"control-plane-machine-set-operator-78cbb6b69f-gfd8p\" (UID: \"74aff32c-9835-440b-9961-5fbcada6c96b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gfd8p" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.833136 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cd26dd1-2322-4cca-9d02-7d924ce912ee-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-splh5\" (UID: \"7cd26dd1-2322-4cca-9d02-7d924ce912ee\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-splh5" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.833706 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-btm7s"] Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.842198 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mtcwj"] Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.853044 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whjzk\" (UniqueName: \"kubernetes.io/projected/6d75b145-9547-49b4-9aea-652ea33cb371-kube-api-access-whjzk\") pod \"marketplace-operator-79b997595-qhf7x\" (UID: \"6d75b145-9547-49b4-9aea-652ea33cb371\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.873406 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv4cf\" (UniqueName: \"kubernetes.io/projected/2d3823b7-bc6c-4206-9a4a-347488ed67ba-kube-api-access-sv4cf\") pod \"migrator-59844c95c7-fmngz\" (UID: \"2d3823b7-bc6c-4206-9a4a-347488ed67ba\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fmngz" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.892145 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zkww\" (UniqueName: \"kubernetes.io/projected/a1394b89-92be-4238-8c46-0be31f9ba572-kube-api-access-6zkww\") pod \"etcd-operator-b45778765-pfbg7\" (UID: \"a1394b89-92be-4238-8c46-0be31f9ba572\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.901998 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.912346 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49zkc\" (UniqueName: \"kubernetes.io/projected/b3515bca-636e-4f35-b3d5-3897757ec083-kube-api-access-49zkc\") pod \"catalog-operator-68c6474976-qxrpc\" (UID: \"b3515bca-636e-4f35-b3d5-3897757ec083\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.943961 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.946917 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv4wh\" (UniqueName: \"kubernetes.io/projected/293c5501-59eb-41ae-b7c4-19b7cffb2b6b-kube-api-access-kv4wh\") pod \"machine-config-server-fqc8k\" (UID: \"293c5501-59eb-41ae-b7c4-19b7cffb2b6b\") " pod="openshift-machine-config-operator/machine-config-server-fqc8k" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.950771 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-splh5" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.952255 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-64gjv"] Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.964103 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6rt8z"] Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.964464 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgbd" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.965702 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rrth\" (UniqueName: \"kubernetes.io/projected/13879fd8-0486-46df-8a5d-ef6c81d712fa-kube-api-access-9rrth\") pod \"openshift-controller-manager-operator-756b6f6bc6-zknvr\" (UID: \"13879fd8-0486-46df-8a5d-ef6c81d712fa\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zknvr" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.975088 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/163e41b7-e7e1-4f98-80df-ea25cca890e5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cqzzn\" (UID: \"163e41b7-e7e1-4f98-80df-ea25cca890e5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.987087 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gfd8p" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.992022 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fmngz" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.995688 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkkgg\" (UniqueName: \"kubernetes.io/projected/f43902ae-4bee-4612-8e55-ca6ffc779ec0-kube-api-access-zkkgg\") pod \"machine-config-operator-74547568cd-hhrz2\" (UID: \"f43902ae-4bee-4612-8e55-ca6ffc779ec0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" Jan 22 09:04:06 crc kubenswrapper[4681]: I0122 09:04:06.999470 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.008116 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.013552 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.015481 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7n82\" (UniqueName: \"kubernetes.io/projected/f3a235c2-8a4c-41c8-b2c2-97ddad58da8b-kube-api-access-g7n82\") pod \"kube-storage-version-migrator-operator-b67b599dd-r5cb5\" (UID: \"f3a235c2-8a4c-41c8-b2c2-97ddad58da8b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r5cb5" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.036177 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-fqc8k" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.045113 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.064326 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njqq6\" (UniqueName: \"kubernetes.io/projected/a6477b5b-15c2-458c-ba42-4be5ca90acb6-kube-api-access-njqq6\") pod \"dns-operator-744455d44c-s5tkc\" (UID: \"a6477b5b-15c2-458c-ba42-4be5ca90acb6\") " pod="openshift-dns-operator/dns-operator-744455d44c-s5tkc" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.067558 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s4ch\" (UniqueName: \"kubernetes.io/projected/47f80193-b6ae-4185-aa1f-320ba7f8dce9-kube-api-access-5s4ch\") pod \"router-default-5444994796-wcld5\" (UID: \"47f80193-b6ae-4185-aa1f-320ba7f8dce9\") " pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:07 crc kubenswrapper[4681]: W0122 09:04:07.078241 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b49c3e1_6d6d_498b_81a7_f40174f7f7c1.slice/crio-0a4f543c3042815d0149ebfcf5ef6c65f7fb6552f40c621e713cb2579f276f3a WatchSource:0}: Error finding container 0a4f543c3042815d0149ebfcf5ef6c65f7fb6552f40c621e713cb2579f276f3a: Status 404 returned error can't find the container with id 0a4f543c3042815d0149ebfcf5ef6c65f7fb6552f40c621e713cb2579f276f3a Jan 22 09:04:07 crc kubenswrapper[4681]: W0122 09:04:07.083201 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3bd3021_b5e7_4c2c_8152_6f0450cea681.slice/crio-c910b5bb75ecacc968612d71a2d33e815b28083b719d764393e78e0a821a15a0 WatchSource:0}: Error finding container c910b5bb75ecacc968612d71a2d33e815b28083b719d764393e78e0a821a15a0: Status 404 returned error can't find the container with id c910b5bb75ecacc968612d71a2d33e815b28083b719d764393e78e0a821a15a0 Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.084881 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfvn6\" (UniqueName: \"kubernetes.io/projected/5334d917-0f71-4e93-ae7a-2a169f3b7a34-kube-api-access-jfvn6\") pod \"cni-sysctl-allowlist-ds-4xk8b\" (UID: \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\") " pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.086236 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-s5tkc" Jan 22 09:04:07 crc kubenswrapper[4681]: W0122 09:04:07.098194 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d28f6be_a6f5_4605_b071_ec453a08a7d7.slice/crio-2d4c87bddf78693c4ef3c7bc2645c217145a0b93a02999bfb5b155e62482d1e6 WatchSource:0}: Error finding container 2d4c87bddf78693c4ef3c7bc2645c217145a0b93a02999bfb5b155e62482d1e6: Status 404 returned error can't find the container with id 2d4c87bddf78693c4ef3c7bc2645c217145a0b93a02999bfb5b155e62482d1e6 Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.104034 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw7sx\" (UniqueName: \"kubernetes.io/projected/ac71e92a-9043-4004-b570-a0ce86bf2e76-kube-api-access-zw7sx\") pod \"service-ca-operator-777779d784-m2xbf\" (UID: \"ac71e92a-9043-4004-b570-a0ce86bf2e76\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.118076 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/793ce360-5662-4226-9eb1-d11580d41655-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pnhr5\" (UID: \"793ce360-5662-4226-9eb1-d11580d41655\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pnhr5" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.141707 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h9bw\" (UniqueName: \"kubernetes.io/projected/559df17a-f729-4647-9893-64aa96331ed6-kube-api-access-9h9bw\") pod \"ingress-canary-5rpfx\" (UID: \"559df17a-f729-4647-9893-64aa96331ed6\") " pod="openshift-ingress-canary/ingress-canary-5rpfx" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.170691 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/49778c45-8be5-4610-8298-01e06333289c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.170761 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/630a0503-a218-4ac5-b1db-01b76a08f5c1-secret-volume\") pod \"collect-profiles-29484540-h6x5h\" (UID: \"630a0503-a218-4ac5-b1db-01b76a08f5c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.170842 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-registry-tls\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.170870 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbqlc\" (UniqueName: \"kubernetes.io/projected/630a0503-a218-4ac5-b1db-01b76a08f5c1-kube-api-access-fbqlc\") pod \"collect-profiles-29484540-h6x5h\" (UID: \"630a0503-a218-4ac5-b1db-01b76a08f5c1\") 
" pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.170895 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49778c45-8be5-4610-8298-01e06333289c-trusted-ca\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.170924 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmr6h\" (UniqueName: \"kubernetes.io/projected/b567a1d4-55e9-4da3-b60e-585a6f7bcbf1-kube-api-access-lmr6h\") pod \"olm-operator-6b444d44fb-jhdck\" (UID: \"b567a1d4-55e9-4da3-b60e-585a6f7bcbf1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.170951 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b567a1d4-55e9-4da3-b60e-585a6f7bcbf1-srv-cert\") pod \"olm-operator-6b444d44fb-jhdck\" (UID: \"b567a1d4-55e9-4da3-b60e-585a6f7bcbf1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.171019 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd8b8\" (UniqueName: \"kubernetes.io/projected/2b7db3ea-7250-4e34-b6a7-7fb14b392441-kube-api-access-bd8b8\") pod \"service-ca-9c57cc56f-qjngv\" (UID: \"2b7db3ea-7250-4e34-b6a7-7fb14b392441\") " pod="openshift-service-ca/service-ca-9c57cc56f-qjngv" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.171054 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.171087 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/630a0503-a218-4ac5-b1db-01b76a08f5c1-config-volume\") pod \"collect-profiles-29484540-h6x5h\" (UID: \"630a0503-a218-4ac5-b1db-01b76a08f5c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.171114 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2b7db3ea-7250-4e34-b6a7-7fb14b392441-signing-key\") pod \"service-ca-9c57cc56f-qjngv\" (UID: \"2b7db3ea-7250-4e34-b6a7-7fb14b392441\") " pod="openshift-service-ca/service-ca-9c57cc56f-qjngv" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.171407 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-bound-sa-token\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc 
kubenswrapper[4681]: I0122 09:04:07.171498 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b567a1d4-55e9-4da3-b60e-585a6f7bcbf1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jhdck\" (UID: \"b567a1d4-55e9-4da3-b60e-585a6f7bcbf1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.171625 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/49778c45-8be5-4610-8298-01e06333289c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.171803 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/49778c45-8be5-4610-8298-01e06333289c-registry-certificates\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.171870 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2b7db3ea-7250-4e34-b6a7-7fb14b392441-signing-cabundle\") pod \"service-ca-9c57cc56f-qjngv\" (UID: \"2b7db3ea-7250-4e34-b6a7-7fb14b392441\") " pod="openshift-service-ca/service-ca-9c57cc56f-qjngv" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.171954 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5dr4\" (UniqueName: \"kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-kube-api-access-c5dr4\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: E0122 09:04:07.174105 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:07.674085721 +0000 UTC m=+38.499996236 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.195203 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.210552 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zknvr" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.230683 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.257475 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pnhr5" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.270844 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r5cb5" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.276515 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:07 crc kubenswrapper[4681]: E0122 09:04:07.276797 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:07.776756984 +0000 UTC m=+38.602667499 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.276920 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd8b8\" (UniqueName: \"kubernetes.io/projected/2b7db3ea-7250-4e34-b6a7-7fb14b392441-kube-api-access-bd8b8\") pod \"service-ca-9c57cc56f-qjngv\" (UID: \"2b7db3ea-7250-4e34-b6a7-7fb14b392441\") " pod="openshift-service-ca/service-ca-9c57cc56f-qjngv" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.276979 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.277013 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c0081e2b-bec4-458a-93cb-a6d580aa9558-mountpoint-dir\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.277085 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrpjv\" (UniqueName: 
\"kubernetes.io/projected/4db34d31-e1cc-4afd-afb4-b0a5a535053d-kube-api-access-mrpjv\") pod \"multus-admission-controller-857f4d67dd-6p99f\" (UID: \"4db34d31-e1cc-4afd-afb4-b0a5a535053d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6p99f" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.277139 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/630a0503-a218-4ac5-b1db-01b76a08f5c1-config-volume\") pod \"collect-profiles-29484540-h6x5h\" (UID: \"630a0503-a218-4ac5-b1db-01b76a08f5c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.277182 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkk7v\" (UniqueName: \"kubernetes.io/projected/aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8-kube-api-access-zkk7v\") pod \"dns-default-9d4hw\" (UID: \"aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8\") " pod="openshift-dns/dns-default-9d4hw" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.277214 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2b7db3ea-7250-4e34-b6a7-7fb14b392441-signing-key\") pod \"service-ca-9c57cc56f-qjngv\" (UID: \"2b7db3ea-7250-4e34-b6a7-7fb14b392441\") " pod="openshift-service-ca/service-ca-9c57cc56f-qjngv" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.277334 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-bound-sa-token\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.277435 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c0081e2b-bec4-458a-93cb-a6d580aa9558-registration-dir\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.277549 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c0081e2b-bec4-458a-93cb-a6d580aa9558-plugins-dir\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.277598 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b567a1d4-55e9-4da3-b60e-585a6f7bcbf1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jhdck\" (UID: \"b567a1d4-55e9-4da3-b60e-585a6f7bcbf1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.277718 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/49778c45-8be5-4610-8298-01e06333289c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.277906 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/49778c45-8be5-4610-8298-01e06333289c-registry-certificates\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.277924 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2b7db3ea-7250-4e34-b6a7-7fb14b392441-signing-cabundle\") pod \"service-ca-9c57cc56f-qjngv\" (UID: \"2b7db3ea-7250-4e34-b6a7-7fb14b392441\") " pod="openshift-service-ca/service-ca-9c57cc56f-qjngv" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.277943 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c0081e2b-bec4-458a-93cb-a6d580aa9558-socket-dir\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.278129 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c0081e2b-bec4-458a-93cb-a6d580aa9558-csi-data-dir\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.278409 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8-metrics-tls\") pod \"dns-default-9d4hw\" (UID: \"aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8\") " pod="openshift-dns/dns-default-9d4hw" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.278445 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5dr4\" (UniqueName: \"kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-kube-api-access-c5dr4\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.278463 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/49778c45-8be5-4610-8298-01e06333289c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.278681 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.278526 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4db34d31-e1cc-4afd-afb4-b0a5a535053d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6p99f\" (UID: \"4db34d31-e1cc-4afd-afb4-b0a5a535053d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6p99f" Jan 22 09:04:07 crc kubenswrapper[4681]: E0122 09:04:07.279958 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:07.779934017 +0000 UTC m=+38.605844532 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.280006 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/630a0503-a218-4ac5-b1db-01b76a08f5c1-secret-volume\") pod \"collect-profiles-29484540-h6x5h\" (UID: \"630a0503-a218-4ac5-b1db-01b76a08f5c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.280115 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x2fm\" (UniqueName: \"kubernetes.io/projected/c0081e2b-bec4-458a-93cb-a6d580aa9558-kube-api-access-9x2fm\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.280147 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-registry-tls\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.280192 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbqlc\" (UniqueName: \"kubernetes.io/projected/630a0503-a218-4ac5-b1db-01b76a08f5c1-kube-api-access-fbqlc\") pod \"collect-profiles-29484540-h6x5h\" (UID: \"630a0503-a218-4ac5-b1db-01b76a08f5c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.280228 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8-config-volume\") pod \"dns-default-9d4hw\" (UID: \"aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8\") " pod="openshift-dns/dns-default-9d4hw" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.280294 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49778c45-8be5-4610-8298-01e06333289c-trusted-ca\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.280315 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmr6h\" (UniqueName: \"kubernetes.io/projected/b567a1d4-55e9-4da3-b60e-585a6f7bcbf1-kube-api-access-lmr6h\") pod \"olm-operator-6b444d44fb-jhdck\" (UID: \"b567a1d4-55e9-4da3-b60e-585a6f7bcbf1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.280365 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b567a1d4-55e9-4da3-b60e-585a6f7bcbf1-srv-cert\") pod \"olm-operator-6b444d44fb-jhdck\" (UID: \"b567a1d4-55e9-4da3-b60e-585a6f7bcbf1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.282965 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/49778c45-8be5-4610-8298-01e06333289c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.284379 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/49778c45-8be5-4610-8298-01e06333289c-registry-certificates\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.291682 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/630a0503-a218-4ac5-b1db-01b76a08f5c1-config-volume\") pod \"collect-profiles-29484540-h6x5h\" (UID: \"630a0503-a218-4ac5-b1db-01b76a08f5c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.293590 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49778c45-8be5-4610-8298-01e06333289c-trusted-ca\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.294918 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-registry-tls\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.294947 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b567a1d4-55e9-4da3-b60e-585a6f7bcbf1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jhdck\" (UID: \"b567a1d4-55e9-4da3-b60e-585a6f7bcbf1\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.296366 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/49778c45-8be5-4610-8298-01e06333289c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.296370 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2b7db3ea-7250-4e34-b6a7-7fb14b392441-signing-cabundle\") pod \"service-ca-9c57cc56f-qjngv\" (UID: \"2b7db3ea-7250-4e34-b6a7-7fb14b392441\") " pod="openshift-service-ca/service-ca-9c57cc56f-qjngv" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.296452 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2b7db3ea-7250-4e34-b6a7-7fb14b392441-signing-key\") pod \"service-ca-9c57cc56f-qjngv\" (UID: \"2b7db3ea-7250-4e34-b6a7-7fb14b392441\") " pod="openshift-service-ca/service-ca-9c57cc56f-qjngv" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.297016 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b567a1d4-55e9-4da3-b60e-585a6f7bcbf1-srv-cert\") pod \"olm-operator-6b444d44fb-jhdck\" (UID: \"b567a1d4-55e9-4da3-b60e-585a6f7bcbf1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.306841 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/630a0503-a218-4ac5-b1db-01b76a08f5c1-secret-volume\") pod \"collect-profiles-29484540-h6x5h\" (UID: \"630a0503-a218-4ac5-b1db-01b76a08f5c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.324564 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd8b8\" (UniqueName: \"kubernetes.io/projected/2b7db3ea-7250-4e34-b6a7-7fb14b392441-kube-api-access-bd8b8\") pod \"service-ca-9c57cc56f-qjngv\" (UID: \"2b7db3ea-7250-4e34-b6a7-7fb14b392441\") " pod="openshift-service-ca/service-ca-9c57cc56f-qjngv" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.351533 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5rpfx" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.360223 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbqlc\" (UniqueName: \"kubernetes.io/projected/630a0503-a218-4ac5-b1db-01b76a08f5c1-kube-api-access-fbqlc\") pod \"collect-profiles-29484540-h6x5h\" (UID: \"630a0503-a218-4ac5-b1db-01b76a08f5c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.367320 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qjngv" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.379841 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-bound-sa-token\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.379874 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.385175 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.385339 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c0081e2b-bec4-458a-93cb-a6d580aa9558-registration-dir\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.385367 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c0081e2b-bec4-458a-93cb-a6d580aa9558-plugins-dir\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.385418 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c0081e2b-bec4-458a-93cb-a6d580aa9558-socket-dir\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.385442 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c0081e2b-bec4-458a-93cb-a6d580aa9558-csi-data-dir\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.385463 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8-metrics-tls\") pod \"dns-default-9d4hw\" (UID: \"aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8\") " pod="openshift-dns/dns-default-9d4hw" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.385487 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4db34d31-e1cc-4afd-afb4-b0a5a535053d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6p99f\" (UID: \"4db34d31-e1cc-4afd-afb4-b0a5a535053d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6p99f" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.385508 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9x2fm\" (UniqueName: \"kubernetes.io/projected/c0081e2b-bec4-458a-93cb-a6d580aa9558-kube-api-access-9x2fm\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.385530 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8-config-volume\") pod \"dns-default-9d4hw\" (UID: \"aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8\") " pod="openshift-dns/dns-default-9d4hw" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.385570 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c0081e2b-bec4-458a-93cb-a6d580aa9558-mountpoint-dir\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.385588 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrpjv\" (UniqueName: \"kubernetes.io/projected/4db34d31-e1cc-4afd-afb4-b0a5a535053d-kube-api-access-mrpjv\") pod \"multus-admission-controller-857f4d67dd-6p99f\" (UID: \"4db34d31-e1cc-4afd-afb4-b0a5a535053d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6p99f" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.385610 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkk7v\" (UniqueName: \"kubernetes.io/projected/aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8-kube-api-access-zkk7v\") pod \"dns-default-9d4hw\" (UID: \"aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8\") " pod="openshift-dns/dns-default-9d4hw" Jan 22 09:04:07 crc kubenswrapper[4681]: E0122 09:04:07.385821 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:07.885808215 +0000 UTC m=+38.711718710 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.386083 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c0081e2b-bec4-458a-93cb-a6d580aa9558-registration-dir\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.386385 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c0081e2b-bec4-458a-93cb-a6d580aa9558-socket-dir\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.386522 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c0081e2b-bec4-458a-93cb-a6d580aa9558-csi-data-dir\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.386546 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c0081e2b-bec4-458a-93cb-a6d580aa9558-mountpoint-dir\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.386582 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c0081e2b-bec4-458a-93cb-a6d580aa9558-plugins-dir\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.386917 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8-config-volume\") pod \"dns-default-9d4hw\" (UID: \"aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8\") " pod="openshift-dns/dns-default-9d4hw" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.389980 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8-metrics-tls\") pod \"dns-default-9d4hw\" (UID: \"aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8\") " pod="openshift-dns/dns-default-9d4hw" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.392804 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4db34d31-e1cc-4afd-afb4-b0a5a535053d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6p99f\" (UID: \"4db34d31-e1cc-4afd-afb4-b0a5a535053d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6p99f" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.393036 4681 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.400505 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5dr4\" (UniqueName: \"kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-kube-api-access-c5dr4\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.406281 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-splh5"] Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.415561 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmr6h\" (UniqueName: \"kubernetes.io/projected/b567a1d4-55e9-4da3-b60e-585a6f7bcbf1-kube-api-access-lmr6h\") pod \"olm-operator-6b444d44fb-jhdck\" (UID: \"b567a1d4-55e9-4da3-b60e-585a6f7bcbf1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.457876 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkk7v\" (UniqueName: \"kubernetes.io/projected/aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8-kube-api-access-zkk7v\") pod \"dns-default-9d4hw\" (UID: \"aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8\") " pod="openshift-dns/dns-default-9d4hw" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.478476 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x2fm\" (UniqueName: \"kubernetes.io/projected/c0081e2b-bec4-458a-93cb-a6d580aa9558-kube-api-access-9x2fm\") pod \"csi-hostpathplugin-96pmp\" (UID: \"c0081e2b-bec4-458a-93cb-a6d580aa9558\") " pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.516337 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: E0122 09:04:07.516845 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:08.016828936 +0000 UTC m=+38.842739441 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.534811 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrpjv\" (UniqueName: \"kubernetes.io/projected/4db34d31-e1cc-4afd-afb4-b0a5a535053d-kube-api-access-mrpjv\") pod \"multus-admission-controller-857f4d67dd-6p99f\" (UID: \"4db34d31-e1cc-4afd-afb4-b0a5a535053d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6p99f" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.540358 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gfd8p"] Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.618804 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:07 crc kubenswrapper[4681]: E0122 09:04:07.619372 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:08.119343515 +0000 UTC m=+38.945254020 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.625408 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.628337 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck" Jan 22 09:04:07 crc kubenswrapper[4681]: W0122 09:04:07.655680 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74aff32c_9835_440b_9961_5fbcada6c96b.slice/crio-1ef235cdb2d134d2378b25342c613b884fc2939af4b5391ef9ace177790a6b4e WatchSource:0}: Error finding container 1ef235cdb2d134d2378b25342c613b884fc2939af4b5391ef9ace177790a6b4e: Status 404 returned error can't find the container with id 1ef235cdb2d134d2378b25342c613b884fc2939af4b5391ef9ace177790a6b4e Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.701535 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6p99f" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.709621 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9d4hw" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.720557 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs\") pod \"network-metrics-daemon-vjf2g\" (UID: \"2e7e003a-24ec-4f48-a156-a5ed6a3afd03\") " pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.720655 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: E0122 09:04:07.721256 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:08.221241807 +0000 UTC m=+39.047152312 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.728032 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e7e003a-24ec-4f48-a156-a5ed6a3afd03-metrics-certs\") pod \"network-metrics-daemon-vjf2g\" (UID: \"2e7e003a-24ec-4f48-a156-a5ed6a3afd03\") " pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.746245 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjf2g" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.765120 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-96pmp" Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.778079 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-s5tkc"] Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.778131 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qhf7x"] Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.803738 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" event={"ID":"e3bd3021-b5e7-4c2c-8152-6f0450cea681","Type":"ContainerStarted","Data":"c910b5bb75ecacc968612d71a2d33e815b28083b719d764393e78e0a821a15a0"} Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.807431 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm" event={"ID":"4b49c3e1-6d6d-498b-81a7-f40174f7f7c1","Type":"ContainerStarted","Data":"0a4f543c3042815d0149ebfcf5ef6c65f7fb6552f40c621e713cb2579f276f3a"} Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.809065 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-splh5" event={"ID":"7cd26dd1-2322-4cca-9d02-7d924ce912ee","Type":"ContainerStarted","Data":"67150699298c49f2b3f78f789a91e239aa6ce9e4de04cb7809ceb69b9c06b6c2"} Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.815559 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n"] Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.819193 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" event={"ID":"c3917a9c-dd78-498f-96b3-36fd8d6421c6","Type":"ContainerStarted","Data":"f4b40f407de3816e613e98f5ef3fddc92c9d8ecb1ff650f447ec185213440fcf"} Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.820468 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66"] Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.821436 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:07 crc kubenswrapper[4681]: E0122 09:04:07.821879 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:08.321862115 +0000 UTC m=+39.147772620 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.822739 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" event={"ID":"0e2a0e30-6d84-4db6-bb01-3012041a2b84","Type":"ContainerStarted","Data":"abdb8b1de3d80cde7d2835c0477eb9e01e3d168c43d46d2b9a90661406b73a4b"} Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.826019 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" event={"ID":"47db6098-5a83-4d02-bec9-886b3dd01a4f","Type":"ContainerStarted","Data":"365d7940ad3c4440c18fdd6b9d4ba1a366a2339172eee119696f753f94179587"} Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.832568 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktf4x" event={"ID":"3566a8b9-503a-4c4b-a008-8bfeb5e38fa8","Type":"ContainerStarted","Data":"c76b29f1fda777202cc27eb9488ea402a9f9457ccec18c95c5140f811a9f1908"} Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.837781 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-64gjv" event={"ID":"3f828bb9-12a2-4148-999b-ab78f638d4b0","Type":"ContainerStarted","Data":"014574d6405766f07954771714fe97e1448230ecb2f2e636fa07ce915bdd16f0"} Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.842188 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-glh9f" event={"ID":"c26ba6fb-8b7a-4207-82ed-3b746c50e824","Type":"ContainerStarted","Data":"cc0b7f3d97b380b926d4d97da8b1e0d14f86935a5d49594d2c758dad6839af53"} Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.867685 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gfd8p" event={"ID":"74aff32c-9835-440b-9961-5fbcada6c96b","Type":"ContainerStarted","Data":"1ef235cdb2d134d2378b25342c613b884fc2939af4b5391ef9ace177790a6b4e"} Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.881547 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" event={"ID":"64b9cc65-319e-48c9-9772-0abae151c1ba","Type":"ContainerDied","Data":"adfe733cc4d7f71f17a003135dcad6b39a66d750d937a9e1b3e56f8715a9f62e"} Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.881578 4681 generic.go:334] "Generic (PLEG): container finished" podID="64b9cc65-319e-48c9-9772-0abae151c1ba" containerID="adfe733cc4d7f71f17a003135dcad6b39a66d750d937a9e1b3e56f8715a9f62e" exitCode=0 Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.892793 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pfbg7"] Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.900293 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb" 
event={"ID":"5ecda186-bc95-4f85-89cb-1c3fcc1354ce","Type":"ContainerStarted","Data":"6bc2bbd45fe99417f7c40c27bea8750ea0cf421f1d9cdbb8b36e35de94176299"} Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.909566 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" event={"ID":"5d28f6be-a6f5-4605-b071-ec453a08a7d7","Type":"ContainerStarted","Data":"2d4c87bddf78693c4ef3c7bc2645c217145a0b93a02999bfb5b155e62482d1e6"} Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.910544 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-klxjr" event={"ID":"2e3ad21c-5666-479f-9ee9-0ccdf0f2c0fa","Type":"ContainerStarted","Data":"0dbf09e6aa31d011cc1d88209ee2a991d9e122e23aa0bb85939b59d06af96257"} Jan 22 09:04:07 crc kubenswrapper[4681]: I0122 09:04:07.923236 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:07 crc kubenswrapper[4681]: E0122 09:04:07.925131 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:08.425118473 +0000 UTC m=+39.251028978 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.005078 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fmngz"] Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.027812 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-c2p8w"] Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.027921 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:08 crc kubenswrapper[4681]: E0122 09:04:08.028092 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:08.528028962 +0000 UTC m=+39.353939467 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.032844 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:08 crc kubenswrapper[4681]: E0122 09:04:08.034951 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:08.534933024 +0000 UTC m=+39.360843529 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.133526 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:08 crc kubenswrapper[4681]: E0122 09:04:08.133712 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:08.633682963 +0000 UTC m=+39.459593468 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.133788 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:08 crc kubenswrapper[4681]: E0122 09:04:08.134133 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:08.634124285 +0000 UTC m=+39.460034890 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:08 crc kubenswrapper[4681]: W0122 09:04:08.153579 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1394b89_92be_4238_8c46_0be31f9ba572.slice/crio-67efd158d76df5bd76e8457c1967c97dcdae2311910505cd6a6f294cfc47fc0c WatchSource:0}: Error finding container 67efd158d76df5bd76e8457c1967c97dcdae2311910505cd6a6f294cfc47fc0c: Status 404 returned error can't find the container with id 67efd158d76df5bd76e8457c1967c97dcdae2311910505cd6a6f294cfc47fc0c Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.189509 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgbd"] Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.234890 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:08 crc kubenswrapper[4681]: E0122 09:04:08.235430 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:08.73539191 +0000 UTC m=+39.561302405 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.235619 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:08 crc kubenswrapper[4681]: E0122 09:04:08.236007 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:08.735988596 +0000 UTC m=+39.561899111 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.342863 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:08 crc kubenswrapper[4681]: E0122 09:04:08.343254 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:08.843191998 +0000 UTC m=+39.669102503 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.344565 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:08 crc kubenswrapper[4681]: E0122 09:04:08.345850 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:08.845827018 +0000 UTC m=+39.671737523 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.394788 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb"] Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.396670 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44"] Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.448503 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:08 crc kubenswrapper[4681]: E0122 09:04:08.448713 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:08.948682105 +0000 UTC m=+39.774592610 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.448928 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:08 crc kubenswrapper[4681]: E0122 09:04:08.449499 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:08.949489097 +0000 UTC m=+39.775399602 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.550995 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:08 crc kubenswrapper[4681]: E0122 09:04:08.551543 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:09.051517582 +0000 UTC m=+39.877428087 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.652580 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:08 crc kubenswrapper[4681]: E0122 09:04:08.653108 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:09.153090186 +0000 UTC m=+39.979000691 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.753837 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:08 crc kubenswrapper[4681]: E0122 09:04:08.754192 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:09.254170906 +0000 UTC m=+40.080081411 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.754371 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:08 crc kubenswrapper[4681]: E0122 09:04:08.754699 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:09.25468692 +0000 UTC m=+40.080597435 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.855950 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:08 crc kubenswrapper[4681]: E0122 09:04:08.856544 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:09.35650686 +0000 UTC m=+40.182417405 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.955885 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-fqc8k" event={"ID":"293c5501-59eb-41ae-b7c4-19b7cffb2b6b","Type":"ContainerStarted","Data":"5d71e6331373a4ab4b47ad3ece14921f869fd2be982f95fd82ef6659bf10cfe5"} Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.956329 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-fqc8k" event={"ID":"293c5501-59eb-41ae-b7c4-19b7cffb2b6b","Type":"ContainerStarted","Data":"89b229bdfadae6e88f8e0761ca391919e140c2c29c0dc194d0093d2671e33bdc"} Jan 22 09:04:08 crc kubenswrapper[4681]: I0122 09:04:08.963744 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:08 crc kubenswrapper[4681]: E0122 09:04:08.964631 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:09.464611666 +0000 UTC m=+40.290522171 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.065341 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:09 crc kubenswrapper[4681]: E0122 09:04:09.067791 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:09.567772372 +0000 UTC m=+40.393682877 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.073296 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-fvrbp" event={"ID":"5c1adb5a-0fb0-4a16-a42e-0b79ec825963","Type":"ContainerStarted","Data":"0ea738c3a93968a540fde5104ecef5a5ac75bf35c98e25780c5ef2e36505afb6"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.074369 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-fvrbp" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.075581 4681 patch_prober.go:28] interesting pod/downloads-7954f5f757-fvrbp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.075627 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fvrbp" podUID="5c1adb5a-0fb0-4a16-a42e-0b79ec825963" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.082672 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-fqc8k" podStartSLOduration=5.082645894 podStartE2EDuration="5.082645894s" podCreationTimestamp="2026-01-22 09:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:09.081317139 +0000 UTC m=+39.907227654" watchObservedRunningTime="2026-01-22 09:04:09.082645894 +0000 UTC m=+39.908556399" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.088347 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ktf4x" podStartSLOduration=18.088301774 podStartE2EDuration="18.088301774s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:09.052664432 +0000 UTC m=+39.878574937" watchObservedRunningTime="2026-01-22 09:04:09.088301774 +0000 UTC m=+39.914212279" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.092686 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgbd" event={"ID":"63433617-6d4f-45f6-9b31-51313dbb4985","Type":"ContainerStarted","Data":"c450f0f7627820de6e5edc264891d8a7ce58fd92c524509ad473418d8d8c2ca1"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.119060 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" event={"ID":"363db0df-34ba-45e7-abce-c19cd7cc4d24","Type":"ContainerStarted","Data":"1a86951920ef9b708203143df067d54e24f11282f719bfe903508875f1464141"} Jan 22 09:04:09 
crc kubenswrapper[4681]: I0122 09:04:09.126194 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fmngz" event={"ID":"2d3823b7-bc6c-4206-9a4a-347488ed67ba","Type":"ContainerStarted","Data":"ab74c538a00e8dfb90d05028beac76c26b793d54fdeee937ccab6bd9fe387ea5"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.142681 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-s5tkc" event={"ID":"a6477b5b-15c2-458c-ba42-4be5ca90acb6","Type":"ContainerStarted","Data":"af1a5b2fb0277169b876b4b4115f5f31852675895b3db4ce35395b55e194054c"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.152902 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-glh9f" podStartSLOduration=18.15288231 podStartE2EDuration="18.15288231s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:09.123888284 +0000 UTC m=+39.949798799" watchObservedRunningTime="2026-01-22 09:04:09.15288231 +0000 UTC m=+39.978792815" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.155758 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" event={"ID":"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c","Type":"ContainerStarted","Data":"fb37ac4c32870ec559225c9092ddd268cc2ce9608a5796940ad2c08d787c7e1e"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.161274 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" event={"ID":"47db6098-5a83-4d02-bec9-886b3dd01a4f","Type":"ContainerStarted","Data":"5c21c99f8daecc6e2be50252b098f47c71d9e950113ee60a74636005c6191572"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.162281 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.167005 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:09 crc kubenswrapper[4681]: E0122 09:04:09.167916 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:09.667893287 +0000 UTC m=+40.493803792 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.180947 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.202117 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" event={"ID":"a2419737-5527-429d-a3b9-213a60b502cb","Type":"ContainerStarted","Data":"2c7eb834e69beb13e0d6b8a1210f3e7d17dbc566b53ca2ecef7aa13a33eac6f4"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.215015 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-fvrbp" podStartSLOduration=18.214998171 podStartE2EDuration="18.214998171s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:09.184543787 +0000 UTC m=+40.010454292" watchObservedRunningTime="2026-01-22 09:04:09.214998171 +0000 UTC m=+40.040908666" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.215391 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" podStartSLOduration=18.215386841 podStartE2EDuration="18.215386841s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:09.213709117 +0000 UTC m=+40.039619622" watchObservedRunningTime="2026-01-22 09:04:09.215386841 +0000 UTC m=+40.041297346" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.231700 4681 generic.go:334] "Generic (PLEG): container finished" podID="5ecda186-bc95-4f85-89cb-1c3fcc1354ce" containerID="aac017884fde7fc397ae9b4edc018f703b8c849df299f3d417cdf964b0392704" exitCode=0 Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.231801 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb" event={"ID":"5ecda186-bc95-4f85-89cb-1c3fcc1354ce","Type":"ContainerDied","Data":"aac017884fde7fc397ae9b4edc018f703b8c849df299f3d417cdf964b0392704"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.235667 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm" event={"ID":"4b49c3e1-6d6d-498b-81a7-f40174f7f7c1","Type":"ContainerStarted","Data":"d38d5694b97ef43da43b33651dcaca64979c75058b3cefdd552dc601b25f8a9e"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.257799 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44" event={"ID":"49d5a414-d020-4687-8d32-3141061d0c80","Type":"ContainerStarted","Data":"588d0a2f1de9547a8d6182a98c84dea656ac73f2f5249fef7f4c032ca39ee5f6"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.266551 4681 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-64gjv" event={"ID":"3f828bb9-12a2-4148-999b-ab78f638d4b0","Type":"ContainerStarted","Data":"16570f66c74de653f6e5fd65f26babab273583279423a39913d220ebb1d478dd"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.267034 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-64gjv" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.267740 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:09 crc kubenswrapper[4681]: E0122 09:04:09.268898 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:09.768882175 +0000 UTC m=+40.594792680 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.278524 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" event={"ID":"e3bd3021-b5e7-4c2c-8152-6f0450cea681","Type":"ContainerStarted","Data":"9162b0f9c6abc4a2f4899f6345381c5f95c25d0a9d458f34d67bfb53d23591d8"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.280029 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.280367 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc"] Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.287002 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qjngv"] Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.287997 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wcld5" event={"ID":"47f80193-b6ae-4185-aa1f-320ba7f8dce9","Type":"ContainerStarted","Data":"5dfcb1badea3d81ec12b1702a1cba037acdc654f7987dbc570fb0e28afb7666f"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.306868 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-klxjr" event={"ID":"2e3ad21c-5666-479f-9ee9-0ccdf0f2c0fa","Type":"ContainerStarted","Data":"3a74bece50669a85df00d33ad6e91f89983b8380f0c19b632d0639540069283c"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.308350 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n" 
event={"ID":"0a017a02-2d7a-442d-befc-943f6dc038cd","Type":"ContainerStarted","Data":"e172d4b7a143ea24155c32468260c190b89895fac5fc9c130cec313bedb2d110"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.308659 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.309428 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gfd8p" event={"ID":"74aff32c-9835-440b-9961-5fbcada6c96b","Type":"ContainerStarted","Data":"f8ebffbb1e8701da4912a9db18ad97f9c906e710999ebd32e2688df60476f2d0"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.322462 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" podStartSLOduration=18.32244658 podStartE2EDuration="18.32244658s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:09.320798316 +0000 UTC m=+40.146708821" watchObservedRunningTime="2026-01-22 09:04:09.32244658 +0000 UTC m=+40.148357085" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.322698 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vwwtm" podStartSLOduration=18.322694897 podStartE2EDuration="18.322694897s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:09.289330025 +0000 UTC m=+40.115240540" watchObservedRunningTime="2026-01-22 09:04:09.322694897 +0000 UTC m=+40.148605402" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.332196 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" event={"ID":"5334d917-0f71-4e93-ae7a-2a169f3b7a34","Type":"ContainerStarted","Data":"c67140e54c0963dfb66ef7efcb88eafb2940484fe07d8026423e8cf1b2f77536"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.346170 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9d4hw"] Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.364095 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" event={"ID":"6d75b145-9547-49b4-9aea-652ea33cb371","Type":"ContainerStarted","Data":"36d04915ce1a09f8f9bf526a4431e87e9f4fa9095b3b505c1a3d421d27e94934"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.370591 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:09 crc kubenswrapper[4681]: E0122 09:04:09.380091 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:09.880071782 +0000 UTC m=+40.705982287 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.410181 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-64gjv" podStartSLOduration=18.410157317 podStartE2EDuration="18.410157317s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:09.360032973 +0000 UTC m=+40.185943478" watchObservedRunningTime="2026-01-22 09:04:09.410157317 +0000 UTC m=+40.236067822" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.418062 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" event={"ID":"c3917a9c-dd78-498f-96b3-36fd8d6421c6","Type":"ContainerStarted","Data":"ef87d5705e4ccc3a8854bf00b42dfae1c41cefd33dbbded0fcac43ef56b13c65"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.444021 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" event={"ID":"5d28f6be-a6f5-4605-b071-ec453a08a7d7","Type":"ContainerStarted","Data":"460c3bc2df004bbad0caeaeeea6612cec7748d59f66183ac01671fcd4bb381d7"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.458618 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gfd8p" podStartSLOduration=17.458588206 podStartE2EDuration="17.458588206s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:09.446950688 +0000 UTC m=+40.272861193" watchObservedRunningTime="2026-01-22 09:04:09.458588206 +0000 UTC m=+40.284498711" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.476308 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:09 crc kubenswrapper[4681]: E0122 09:04:09.477714 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:09.977694701 +0000 UTC m=+40.803605206 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.495548 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" event={"ID":"0e2a0e30-6d84-4db6-bb01-3012041a2b84","Type":"ContainerStarted","Data":"a4b033b29f6999d0e0521f0d0bf1afcf0a9dd511990fa3c8625815d602b444dd"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.495623 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" event={"ID":"a1394b89-92be-4238-8c46-0be31f9ba572","Type":"ContainerStarted","Data":"67efd158d76df5bd76e8457c1967c97dcdae2311910505cd6a6f294cfc47fc0c"} Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.506739 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4rq95" podStartSLOduration=18.506707847 podStartE2EDuration="18.506707847s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:09.490112639 +0000 UTC m=+40.316023164" watchObservedRunningTime="2026-01-22 09:04:09.506707847 +0000 UTC m=+40.332618352" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.506897 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zknvr"] Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.516924 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5rpfx"] Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.519134 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn"] Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.520376 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-btm7s" podStartSLOduration=18.520362598 podStartE2EDuration="18.520362598s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:09.517999675 +0000 UTC m=+40.343910180" watchObservedRunningTime="2026-01-22 09:04:09.520362598 +0000 UTC m=+40.346273103" Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.524327 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pnhr5"] Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.546962 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck"] Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.553187 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf"] Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.560255 
4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r5cb5"] Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.584246 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:09 crc kubenswrapper[4681]: E0122 09:04:09.587189 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:10.087168843 +0000 UTC m=+40.913079348 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:09 crc kubenswrapper[4681]: W0122 09:04:09.588230 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13879fd8_0486_46df_8a5d_ef6c81d712fa.slice/crio-554e52ec3040562062d944bcf2d33b5abfde6ee40296b5e67526b54c28d39dcd WatchSource:0}: Error finding container 554e52ec3040562062d944bcf2d33b5abfde6ee40296b5e67526b54c28d39dcd: Status 404 returned error can't find the container with id 554e52ec3040562062d944bcf2d33b5abfde6ee40296b5e67526b54c28d39dcd Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.609472 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2"] Jan 22 09:04:09 crc kubenswrapper[4681]: W0122 09:04:09.667963 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod163e41b7_e7e1_4f98_80df_ea25cca890e5.slice/crio-7c756ddc9697e893aff19a35a27ff44690c9925c4bef9d0de0caf82f42eda63d WatchSource:0}: Error finding container 7c756ddc9697e893aff19a35a27ff44690c9925c4bef9d0de0caf82f42eda63d: Status 404 returned error can't find the container with id 7c756ddc9697e893aff19a35a27ff44690c9925c4bef9d0de0caf82f42eda63d Jan 22 09:04:09 crc kubenswrapper[4681]: W0122 09:04:09.680670 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod559df17a_f729_4647_9893_64aa96331ed6.slice/crio-928ed4fe50606e896732d7fe6e8060e9688d8a65afec89fe9a31975d8efd5c6b WatchSource:0}: Error finding container 928ed4fe50606e896732d7fe6e8060e9688d8a65afec89fe9a31975d8efd5c6b: Status 404 returned error can't find the container with id 928ed4fe50606e896732d7fe6e8060e9688d8a65afec89fe9a31975d8efd5c6b Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.688807 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:09 crc kubenswrapper[4681]: E0122 09:04:09.689099 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:10.189072455 +0000 UTC m=+41.014982960 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.689605 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:09 crc kubenswrapper[4681]: E0122 09:04:09.690707 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:10.190697928 +0000 UTC m=+41.016608433 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.708688 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-96pmp"] Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.726775 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6p99f"] Jan 22 09:04:09 crc kubenswrapper[4681]: W0122 09:04:09.729052 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb567a1d4_55e9_4da3_b60e_585a6f7bcbf1.slice/crio-8addb91b06c3461b93ac1ad2654733d15c06d884ab87d01917306f2b775174dd WatchSource:0}: Error finding container 8addb91b06c3461b93ac1ad2654733d15c06d884ab87d01917306f2b775174dd: Status 404 returned error can't find the container with id 8addb91b06c3461b93ac1ad2654733d15c06d884ab87d01917306f2b775174dd Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.754158 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h"] Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.765133 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vjf2g"] Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.792187 4681 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:09 crc kubenswrapper[4681]: E0122 09:04:09.792564 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:10.292549049 +0000 UTC m=+41.118459554 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.893579 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:09 crc kubenswrapper[4681]: E0122 09:04:09.893966 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:10.393953528 +0000 UTC m=+41.219864023 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:09 crc kubenswrapper[4681]: I0122 09:04:09.995210 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:09 crc kubenswrapper[4681]: E0122 09:04:09.996022 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:10.496002164 +0000 UTC m=+41.321912669 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:10 crc kubenswrapper[4681]: E0122 09:04:10.054931 4681 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod363db0df_34ba_45e7_abce_c19cd7cc4d24.slice/crio-dd830d6a39b70fb7eaab7fc5c117f2420b96aeb42fc943452822c6ba8f8a1923.scope\": RecentStats: unable to find data in memory cache]" Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.097599 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:10 crc kubenswrapper[4681]: E0122 09:04:10.098018 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:10.598003809 +0000 UTC m=+41.423914314 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.139657 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-64gjv" Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.200321 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:10 crc kubenswrapper[4681]: E0122 09:04:10.200899 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:10.700877737 +0000 UTC m=+41.526788242 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.303397 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:10 crc kubenswrapper[4681]: E0122 09:04:10.303988 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:10.80396637 +0000 UTC m=+41.629876865 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.407582 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:10 crc kubenswrapper[4681]: E0122 09:04:10.407790 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:10.907762292 +0000 UTC m=+41.733672797 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.408444 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:10 crc kubenswrapper[4681]: E0122 09:04:10.408891 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:10.908873292 +0000 UTC m=+41.734783797 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.419326 4681 csr.go:261] certificate signing request csr-7d55f is approved, waiting to be issued Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.424567 4681 csr.go:257] certificate signing request csr-7d55f is issued Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.510187 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:10 crc kubenswrapper[4681]: E0122 09:04:10.510582 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:11.010561968 +0000 UTC m=+41.836472473 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.554213 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" event={"ID":"6d75b145-9547-49b4-9aea-652ea33cb371","Type":"ContainerStarted","Data":"969ce50d2f970fec1c3be7a46c6510eb0fe8554ad7fce758265e3693c1bc7dc9"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.555018 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.556450 4681 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qhf7x container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.556500 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" podUID="6d75b145-9547-49b4-9aea-652ea33cb371" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.578554 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" event={"ID":"163e41b7-e7e1-4f98-80df-ea25cca890e5","Type":"ContainerStarted","Data":"7c756ddc9697e893aff19a35a27ff44690c9925c4bef9d0de0caf82f42eda63d"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.599196 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6p99f" event={"ID":"4db34d31-e1cc-4afd-afb4-b0a5a535053d","Type":"ContainerStarted","Data":"d1528543af221f4f8927803a2adbf4edeae5d4c132acf22c97d8b64024e04372"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.610915 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgbd" event={"ID":"63433617-6d4f-45f6-9b31-51313dbb4985","Type":"ContainerStarted","Data":"0195cee22a14cb8ffed92281eaf24405ed3786fbc2fed801cda82b392f4995a8"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.614522 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:10 crc kubenswrapper[4681]: E0122 09:04:10.615748 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-22 09:04:11.115728647 +0000 UTC m=+41.941639152 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.624174 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9d4hw" event={"ID":"aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8","Type":"ContainerStarted","Data":"69d043c08f7ee9d5855440b92334191566f5551a5b06e3927a6136d1b1976374"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.630079 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb" event={"ID":"5ecda186-bc95-4f85-89cb-1c3fcc1354ce","Type":"ContainerStarted","Data":"b6f880a45b3a2bf8e26688bcd3a32edb865c184b33aacb6858d1bd0b64ccc242"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.631212 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb" Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.642442 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" podStartSLOduration=18.642427162 podStartE2EDuration="18.642427162s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:10.590743717 +0000 UTC m=+41.416654222" watchObservedRunningTime="2026-01-22 09:04:10.642427162 +0000 UTC m=+41.468337667" Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.643645 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgbd" podStartSLOduration=18.643641204 podStartE2EDuration="18.643641204s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:10.640957843 +0000 UTC m=+41.466868338" watchObservedRunningTime="2026-01-22 09:04:10.643641204 +0000 UTC m=+41.469551709" Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.652632 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44" event={"ID":"49d5a414-d020-4687-8d32-3141061d0c80","Type":"ContainerStarted","Data":"799733f53898ac01cc6c3e14397dff09cf5248708aa9bb35af3da66b2bc7b414"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.659084 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" event={"ID":"b3515bca-636e-4f35-b3d5-3897757ec083","Type":"ContainerStarted","Data":"b4e4acd753ee2b146fabd388be2c62c9e329f50413e87fb35c8ab2753ee95c64"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.659145 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" 
event={"ID":"b3515bca-636e-4f35-b3d5-3897757ec083","Type":"ContainerStarted","Data":"b536ae8f2071f88550479139a35bcb1217abc58f0fe5225bc9a99f3491eb4603"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.662262 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.667276 4681 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qxrpc container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.667334 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" podUID="b3515bca-636e-4f35-b3d5-3897757ec083" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.667681 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf" event={"ID":"ac71e92a-9043-4004-b570-a0ce86bf2e76","Type":"ContainerStarted","Data":"c4c4a04db0c0662d5b611675cbcd64d1467352faa0600b136523111a0fb7a0ec"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.680196 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pnhr5" event={"ID":"793ce360-5662-4226-9eb1-d11580d41655","Type":"ContainerStarted","Data":"8a97ae0326bde0492f6bbf44b70032ca47d488c07b96fc764e3103bd12ee4117"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.704625 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb" podStartSLOduration=19.704600025 podStartE2EDuration="19.704600025s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:10.704010669 +0000 UTC m=+41.529921174" watchObservedRunningTime="2026-01-22 09:04:10.704600025 +0000 UTC m=+41.530510530" Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.705555 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-splh5" event={"ID":"7cd26dd1-2322-4cca-9d02-7d924ce912ee","Type":"ContainerStarted","Data":"7954237a5190d00ab2ea56bb684aaed395b5e0dcf85b0efdfcfd410e91a80543"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.713221 4681 generic.go:334] "Generic (PLEG): container finished" podID="363db0df-34ba-45e7-abce-c19cd7cc4d24" containerID="dd830d6a39b70fb7eaab7fc5c117f2420b96aeb42fc943452822c6ba8f8a1923" exitCode=0 Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.713507 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" event={"ID":"363db0df-34ba-45e7-abce-c19cd7cc4d24","Type":"ContainerDied","Data":"dd830d6a39b70fb7eaab7fc5c117f2420b96aeb42fc943452822c6ba8f8a1923"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.715567 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.715797 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5rpfx" event={"ID":"559df17a-f729-4647-9893-64aa96331ed6","Type":"ContainerStarted","Data":"928ed4fe50606e896732d7fe6e8060e9688d8a65afec89fe9a31975d8efd5c6b"} Jan 22 09:04:10 crc kubenswrapper[4681]: E0122 09:04:10.715850 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:11.215810791 +0000 UTC m=+42.041721296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.716875 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:10 crc kubenswrapper[4681]: E0122 09:04:10.718444 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:11.21843273 +0000 UTC m=+42.044343445 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.732543 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" podStartSLOduration=18.732522972 podStartE2EDuration="18.732522972s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:10.729683697 +0000 UTC m=+41.555594202" watchObservedRunningTime="2026-01-22 09:04:10.732522972 +0000 UTC m=+41.558433477" Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.753510 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" event={"ID":"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c","Type":"ContainerStarted","Data":"67523b472eff08d46515826d753e0bd9bdfc0c06ed0d3285cd22b291dedcdf2f"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.753743 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.755975 4681 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ckctb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.756015 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" podUID="fd2a235c-01fb-4c8c-97d6-cf399c39ab1c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.757308 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r5cb5" event={"ID":"f3a235c2-8a4c-41c8-b2c2-97ddad58da8b","Type":"ContainerStarted","Data":"426d324ac1ee463b5577e7d1ae1e7817f9c6c3c4d6666a1730f34efd7cacdfb4"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.758677 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vjf2g" event={"ID":"2e7e003a-24ec-4f48-a156-a5ed6a3afd03","Type":"ContainerStarted","Data":"25d8f0db58d6f2d5cee999d3da1e53fb9bb0c47e53a2bc190e3c35d7d9dc0fed"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.761274 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-splh5" podStartSLOduration=18.761263052 podStartE2EDuration="18.761263052s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:10.759951117 +0000 UTC m=+41.585861622" watchObservedRunningTime="2026-01-22 09:04:10.761263052 +0000 UTC m=+41.587173557" Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.770563 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" event={"ID":"a2419737-5527-429d-a3b9-213a60b502cb","Type":"ContainerStarted","Data":"3a4e6f67e8f2cdfedc06900c7d091c72cc984157ae9c8174825704ca0bf36c47"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.771276 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.803320 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wcld5" event={"ID":"47f80193-b6ae-4185-aa1f-320ba7f8dce9","Type":"ContainerStarted","Data":"59aba4c6fd214b33f20402300d6724491666f33b6f39913178e7dd3ba761742d"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.810994 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-96pmp" event={"ID":"c0081e2b-bec4-458a-93cb-a6d580aa9558","Type":"ContainerStarted","Data":"558a321a2a445933178550afde44decc4dcb7dc32831581ff0c66ac862c1da7d"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.818932 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:10 crc kubenswrapper[4681]: E0122 09:04:10.819892 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:11.31986151 +0000 UTC m=+42.145772015 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.857424 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n" event={"ID":"0a017a02-2d7a-442d-befc-943f6dc038cd","Type":"ContainerStarted","Data":"7e28068e776d4080691a2e30b21c6e219543f1fba907da6a21fa7ec3eb52d3f3"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.883307 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fmngz" event={"ID":"2d3823b7-bc6c-4206-9a4a-347488ed67ba","Type":"ContainerStarted","Data":"b4b49c0d138a0d98949ac6ae2002cba936352c38f14545f1b15c1726dff131f2"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.893551 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" podStartSLOduration=18.893526426 podStartE2EDuration="18.893526426s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:10.892277613 +0000 UTC m=+41.718188108" watchObservedRunningTime="2026-01-22 09:04:10.893526426 +0000 UTC m=+41.719436921" Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.916646 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" event={"ID":"5d28f6be-a6f5-4605-b071-ec453a08a7d7","Type":"ContainerStarted","Data":"7c7adf56b82aac789375a8408afa1612ce01e86fc6aae6ff3c975e171c31dcfd"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.922829 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:10 crc kubenswrapper[4681]: E0122 09:04:10.924521 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:11.424504844 +0000 UTC m=+42.250415339 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.933238 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qjngv" event={"ID":"2b7db3ea-7250-4e34-b6a7-7fb14b392441","Type":"ContainerStarted","Data":"5065aa5419cd4236be6472ede80cd7eb87f81e3e69ac2f2bbdfefc8d66d3639b"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.933298 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qjngv" event={"ID":"2b7db3ea-7250-4e34-b6a7-7fb14b392441","Type":"ContainerStarted","Data":"c7854157b65c666357181bf1e5a4810b01c2b5d3c0f3a654d93dc86109ed3a2d"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.935242 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" event={"ID":"5334d917-0f71-4e93-ae7a-2a169f3b7a34","Type":"ContainerStarted","Data":"74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65"} Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.935846 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:10 crc kubenswrapper[4681]: I0122 09:04:10.983105 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" event={"ID":"64b9cc65-319e-48c9-9772-0abae151c1ba","Type":"ContainerStarted","Data":"152ebc5a591d3a421183c63cb680671cc1c4b7319486512caf5d671406b7eccf"} Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.003441 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" podStartSLOduration=19.003421429 podStartE2EDuration="19.003421429s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:10.933475901 +0000 UTC m=+41.759386406" watchObservedRunningTime="2026-01-22 09:04:11.003421429 +0000 UTC m=+41.829331934" Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.003579 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-wcld5" podStartSLOduration=19.003575433 podStartE2EDuration="19.003575433s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:10.98186787 +0000 UTC m=+41.807778375" watchObservedRunningTime="2026-01-22 09:04:11.003575433 +0000 UTC m=+41.829485938" Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.026859 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:11 crc kubenswrapper[4681]: 
E0122 09:04:11.027372 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:11.527349062 +0000 UTC m=+42.353259567 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.035776 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:11 crc kubenswrapper[4681]: E0122 09:04:11.037794 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:11.537779427 +0000 UTC m=+42.363689932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.041031 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-mtcwj" podStartSLOduration=19.041008382 podStartE2EDuration="19.041008382s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:11.040966001 +0000 UTC m=+41.866876506" watchObservedRunningTime="2026-01-22 09:04:11.041008382 +0000 UTC m=+41.866918877" Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.065566 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" podStartSLOduration=7.065548891 podStartE2EDuration="7.065548891s" podCreationTimestamp="2026-01-22 09:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:11.065108519 +0000 UTC m=+41.891019034" watchObservedRunningTime="2026-01-22 09:04:11.065548891 +0000 UTC m=+41.891459396" Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.095215 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-qjngv" podStartSLOduration=19.095008859 podStartE2EDuration="19.095008859s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:11.092053121 +0000 UTC m=+41.917963626" watchObservedRunningTime="2026-01-22 09:04:11.095008859 +0000 UTC m=+41.920919364" Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.126454 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-klxjr" event={"ID":"2e3ad21c-5666-479f-9ee9-0ccdf0f2c0fa","Type":"ContainerStarted","Data":"f50d678440674290b7248f0c89f799524d3a6759d76a3b9200ed4407529db373"} Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.126566 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.135042 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" event={"ID":"630a0503-a218-4ac5-b1db-01b76a08f5c1","Type":"ContainerStarted","Data":"ebabdeea40776306f7fab0142bb0db77b5d01e97d9362cc2966131af275df1bb"} Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.136844 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:11 crc kubenswrapper[4681]: E0122 09:04:11.137406 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:11.637385949 +0000 UTC m=+42.463296444 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.154632 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zknvr" event={"ID":"13879fd8-0486-46df-8a5d-ef6c81d712fa","Type":"ContainerStarted","Data":"554e52ec3040562062d944bcf2d33b5abfde6ee40296b5e67526b54c28d39dcd"} Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.207254 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" podStartSLOduration=19.207232264 podStartE2EDuration="19.207232264s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:11.166631191 +0000 UTC m=+41.992541696" watchObservedRunningTime="2026-01-22 09:04:11.207232264 +0000 UTC m=+42.033142759" Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.235201 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.239127 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" event={"ID":"a1394b89-92be-4238-8c46-0be31f9ba572","Type":"ContainerStarted","Data":"5d92a0034d5dda04c1c71fc0fb351c89f8fe75c97b0888aab88dbae100e7339a"} Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.241275 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-klxjr" podStartSLOduration=20.241249813 podStartE2EDuration="20.241249813s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:11.207760868 +0000 UTC m=+42.033671373" watchObservedRunningTime="2026-01-22 09:04:11.241249813 +0000 UTC m=+42.067160318" Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.241339 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:11 crc kubenswrapper[4681]: E0122 09:04:11.241700 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:11.741687164 +0000 UTC m=+42.567597669 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.274031 4681 patch_prober.go:28] interesting pod/router-default-5444994796-wcld5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 09:04:11 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Jan 22 09:04:11 crc kubenswrapper[4681]: [+]process-running ok Jan 22 09:04:11 crc kubenswrapper[4681]: healthz check failed Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.274113 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wcld5" podUID="47f80193-b6ae-4185-aa1f-320ba7f8dce9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.294584 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.295906 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.296468 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zknvr" podStartSLOduration=20.296446401 podStartE2EDuration="20.296446401s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:11.295210458 +0000 UTC m=+42.121120963" watchObservedRunningTime="2026-01-22 09:04:11.296446401 +0000 UTC m=+42.122356906" Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.340573 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.345965 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:11 crc kubenswrapper[4681]: E0122 09:04:11.347517 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:11.84749326 +0000 UTC m=+42.673403765 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.367123 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-pfbg7" podStartSLOduration=20.367103248 podStartE2EDuration="20.367103248s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:11.360848362 +0000 UTC m=+42.186758877" watchObservedRunningTime="2026-01-22 09:04:11.367103248 +0000 UTC m=+42.193013763" Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.381981 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" event={"ID":"f43902ae-4bee-4612-8e55-ca6ffc779ec0","Type":"ContainerStarted","Data":"c7218e688328c7401f46ae4da71c00214a3cc76a0cde8e6ea12c7ea9555b0de4"} Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.412590 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-s5tkc" event={"ID":"a6477b5b-15c2-458c-ba42-4be5ca90acb6","Type":"ContainerStarted","Data":"7c224850a66eb0841b87af223d52c6c8a55d95445770ea832391fe7ce6cb07eb"} Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.432996 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-22 08:59:10 +0000 UTC, rotation deadline is 2026-12-10 22:55:40.159014994 +0000 UTC Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.433030 4681 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7741h51m28.725988125s for next certificate rotation Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.449109 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:11 crc kubenswrapper[4681]: E0122 09:04:11.449824 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:11.949795992 +0000 UTC m=+42.775706667 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.482944 4681 patch_prober.go:28] interesting pod/downloads-7954f5f757-fvrbp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.483310 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fvrbp" podUID="5c1adb5a-0fb0-4a16-a42e-0b79ec825963" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.555925 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:11 crc kubenswrapper[4681]: E0122 09:04:11.559563 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:12.059520851 +0000 UTC m=+42.885431366 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.586650 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck" event={"ID":"b567a1d4-55e9-4da3-b60e-585a6f7bcbf1","Type":"ContainerStarted","Data":"8addb91b06c3461b93ac1ad2654733d15c06d884ab87d01917306f2b775174dd"} Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.665727 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:11 crc kubenswrapper[4681]: E0122 09:04:11.666540 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:12.166528248 +0000 UTC m=+42.992438753 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.767799 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:11 crc kubenswrapper[4681]: E0122 09:04:11.768215 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:12.268199015 +0000 UTC m=+43.094109520 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.772486 4681 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-m5r66 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.772582 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" podUID="a2419737-5527-429d-a3b9-213a60b502cb" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.870065 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:11 crc kubenswrapper[4681]: E0122 09:04:11.870415 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:12.370405055 +0000 UTC m=+43.196315560 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:11 crc kubenswrapper[4681]: I0122 09:04:11.974451 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:11 crc kubenswrapper[4681]: E0122 09:04:11.975312 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:12.475276795 +0000 UTC m=+43.301187300 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.042846 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-4xk8b"] Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.080129 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:12 crc kubenswrapper[4681]: E0122 09:04:12.080476 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:12.580464374 +0000 UTC m=+43.406374879 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.184562 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:12 crc kubenswrapper[4681]: E0122 09:04:12.184840 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:12.684801521 +0000 UTC m=+43.510712026 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.184976 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:12 crc kubenswrapper[4681]: E0122 09:04:12.185815 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:12.685805027 +0000 UTC m=+43.511715522 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.249577 4681 patch_prober.go:28] interesting pod/router-default-5444994796-wcld5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 09:04:12 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Jan 22 09:04:12 crc kubenswrapper[4681]: [+]process-running ok Jan 22 09:04:12 crc kubenswrapper[4681]: healthz check failed Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.249993 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wcld5" podUID="47f80193-b6ae-4185-aa1f-320ba7f8dce9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.287272 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:12 crc kubenswrapper[4681]: E0122 09:04:12.287711 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:12.787693849 +0000 UTC m=+43.613604354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.390694 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:12 crc kubenswrapper[4681]: E0122 09:04:12.391151 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:12.891135842 +0000 UTC m=+43.717046347 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.491790 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:12 crc kubenswrapper[4681]: E0122 09:04:12.492684 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:12.992660165 +0000 UTC m=+43.818570670 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.501613 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pnhr5" event={"ID":"793ce360-5662-4226-9eb1-d11580d41655","Type":"ContainerStarted","Data":"1062fc51be7d9c6f1c8749b8e79358333a98c08e29ff3d75f2d0c1375786b65d"} Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.542613 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vjf2g" event={"ID":"2e7e003a-24ec-4f48-a156-a5ed6a3afd03","Type":"ContainerStarted","Data":"f8631e8e4170b6e00ead033743688352469b02100ea8ec57ac5f14e2ff113009"} Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.542999 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pnhr5" podStartSLOduration=20.542973074 podStartE2EDuration="20.542973074s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:12.542583703 +0000 UTC m=+43.368494208" watchObservedRunningTime="2026-01-22 09:04:12.542973074 +0000 UTC m=+43.368883579" Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.571845 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44" event={"ID":"49d5a414-d020-4687-8d32-3141061d0c80","Type":"ContainerStarted","Data":"4de2ab6889e8a6132ff411b8e725e083c629aab66d336b1092405e369e910fb8"} Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.573060 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44" Jan 22 
09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.595642 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:12 crc kubenswrapper[4681]: E0122 09:04:12.596991 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:13.096979371 +0000 UTC m=+43.922889876 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.619380 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-s5tkc" event={"ID":"a6477b5b-15c2-458c-ba42-4be5ca90acb6","Type":"ContainerStarted","Data":"5545c8ea77a327df0d169a46e9ab74387966cec7a6d34d365cb8f470ca8d6d9e"} Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.634543 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44" podStartSLOduration=20.634523142 podStartE2EDuration="20.634523142s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:12.632186281 +0000 UTC m=+43.458096776" watchObservedRunningTime="2026-01-22 09:04:12.634523142 +0000 UTC m=+43.460433647" Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.657931 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6p99f" event={"ID":"4db34d31-e1cc-4afd-afb4-b0a5a535053d","Type":"ContainerStarted","Data":"7c53782c142c355dbb1ae0debae972a03a791cadd4785989dea134e861707594"} Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.696837 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:12 crc kubenswrapper[4681]: E0122 09:04:12.697496 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:13.197375873 +0000 UTC m=+44.023286378 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.697659 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:12 crc kubenswrapper[4681]: E0122 09:04:12.699700 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:13.199689154 +0000 UTC m=+44.025599659 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.734768 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5rpfx" event={"ID":"559df17a-f729-4647-9893-64aa96331ed6","Type":"ContainerStarted","Data":"05b665a17483102c3f4c11fc350ffaada956fa35bb663a91958690a4b2d890d1"} Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.757893 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck" event={"ID":"b567a1d4-55e9-4da3-b60e-585a6f7bcbf1","Type":"ContainerStarted","Data":"66b8b7c59ceaff2ab962876ca9e074eae4de124c3a8a8ba468235c0ceccb80e7"} Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.757951 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck" Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.780178 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf" event={"ID":"ac71e92a-9043-4004-b570-a0ce86bf2e76","Type":"ContainerStarted","Data":"15170e10d6ddd5596226d3d95d206a5f98a16e9ed139c93be448f3799eedaf55"} Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.790056 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-s5tkc" podStartSLOduration=21.790026701 podStartE2EDuration="21.790026701s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:12.685297674 +0000 UTC m=+43.511208179" watchObservedRunningTime="2026-01-22 09:04:12.790026701 +0000 UTC m=+43.615937206" Jan 22 09:04:12 crc kubenswrapper[4681]: 
I0122 09:04:12.790777 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5rpfx" podStartSLOduration=8.79076924 podStartE2EDuration="8.79076924s" podCreationTimestamp="2026-01-22 09:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:12.79075426 +0000 UTC m=+43.616664755" watchObservedRunningTime="2026-01-22 09:04:12.79076924 +0000 UTC m=+43.616679745" Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.815763 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:12 crc kubenswrapper[4681]: E0122 09:04:12.817814 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:13.317786024 +0000 UTC m=+44.143696529 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.818477 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n" event={"ID":"0a017a02-2d7a-442d-befc-943f6dc038cd","Type":"ContainerStarted","Data":"eb56b133013b946e82862eba6843018bba7c80b5012adf0167381d49dc651cd0"} Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.833273 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" event={"ID":"630a0503-a218-4ac5-b1db-01b76a08f5c1","Type":"ContainerStarted","Data":"950577f6efd0f52f9114d15046386d4fedee5e59db9af4df2dd946762bca948c"} Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.857723 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" event={"ID":"163e41b7-e7e1-4f98-80df-ea25cca890e5","Type":"ContainerStarted","Data":"61222bef6e8d6bcd2a3f09a250946ca65bb5f6ac8b469fa52b14d4da8affc967"} Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.857780 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" event={"ID":"163e41b7-e7e1-4f98-80df-ea25cca890e5","Type":"ContainerStarted","Data":"438af975d8371435ca7074dee7de8b1b6fe50a8dc31b140ede0a91cf50d3ee6b"} Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.884406 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck" Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.914458 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-m2xbf" podStartSLOduration=20.914439518000002 podStartE2EDuration="20.914439518s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:12.858615383 +0000 UTC m=+43.684525888" watchObservedRunningTime="2026-01-22 09:04:12.914439518 +0000 UTC m=+43.740350023" Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.914872 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jhdck" podStartSLOduration=20.914866199 podStartE2EDuration="20.914866199s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:12.913248336 +0000 UTC m=+43.739158841" watchObservedRunningTime="2026-01-22 09:04:12.914866199 +0000 UTC m=+43.740776704" Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.925683 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.927469 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" event={"ID":"f43902ae-4bee-4612-8e55-ca6ffc779ec0","Type":"ContainerStarted","Data":"008d6094862b0026e2286605d168cdd1ea09eca666e4013a4e6263b9d32f0cf9"} Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.927521 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" event={"ID":"f43902ae-4bee-4612-8e55-ca6ffc779ec0","Type":"ContainerStarted","Data":"9734981cdad30964debfd913aece968d734f9106adcb1903d922b70fd3eed69f"} Jan 22 09:04:12 crc kubenswrapper[4681]: E0122 09:04:12.929030 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:13.429007593 +0000 UTC m=+44.254918308 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.933212 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sdgpn"] Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.934305 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sdgpn" Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.941935 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.942600 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sdgpn"] Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.953397 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zknvr" event={"ID":"13879fd8-0486-46df-8a5d-ef6c81d712fa","Type":"ContainerStarted","Data":"46c3a410b2f0e7139fd23ad3b739974a5150e82399b204239b86159b18fbd079"} Jan 22 09:04:12 crc kubenswrapper[4681]: I0122 09:04:12.994465 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r5cb5" event={"ID":"f3a235c2-8a4c-41c8-b2c2-97ddad58da8b","Type":"ContainerStarted","Data":"44aeb1cb5f0e6aedfb711cac375fcce50835e5353748665bdfca0ea20647d73f"} Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.026978 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.027396 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-catalog-content\") pod \"community-operators-sdgpn\" (UID: \"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b\") " pod="openshift-marketplace/community-operators-sdgpn" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.027489 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99lfs\" (UniqueName: \"kubernetes.io/projected/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-kube-api-access-99lfs\") pod \"community-operators-sdgpn\" (UID: \"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b\") " pod="openshift-marketplace/community-operators-sdgpn" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.027582 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-utilities\") pod \"community-operators-sdgpn\" (UID: \"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b\") " pod="openshift-marketplace/community-operators-sdgpn" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.027947 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" podStartSLOduration=22.027929715 podStartE2EDuration="22.027929715s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:13.026040435 +0000 UTC m=+43.851950940" watchObservedRunningTime="2026-01-22 09:04:13.027929715 +0000 UTC m=+43.853840220" Jan 22 09:04:13 crc kubenswrapper[4681]: E0122 09:04:13.028530 4681 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:13.52850896 +0000 UTC m=+44.354419465 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.054708 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" event={"ID":"363db0df-34ba-45e7-abce-c19cd7cc4d24","Type":"ContainerStarted","Data":"7b508acbd977c6794de95d932936e1c944b6a1dd44ac926b9ba072e6cec7f188"} Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.102520 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cqzzn" podStartSLOduration=22.102505325 podStartE2EDuration="22.102505325s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:13.10117251 +0000 UTC m=+43.927083015" watchObservedRunningTime="2026-01-22 09:04:13.102505325 +0000 UTC m=+43.928415830" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.107102 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h8d59"] Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.115458 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h8d59" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.118658 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9d4hw" event={"ID":"aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8","Type":"ContainerStarted","Data":"55429612aa0ee186cf9b19c15f3e944851fc61c78c9388eb4ab1d00c4166b93f"} Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.118703 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9d4hw" event={"ID":"aa3aca35-a9f8-4e91-9d7d-7c0f94f139f8","Type":"ContainerStarted","Data":"984f3798635a9a1e9fae3f042eb289f2e1ad4d9f3712bef7d0f8384acf152460"} Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.119814 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-9d4hw" Jan 22 09:04:13 crc kubenswrapper[4681]: W0122 09:04:13.120596 4681 reflector.go:561] object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g": failed to list *v1.Secret: secrets "certified-operators-dockercfg-4rs5g" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Jan 22 09:04:13 crc kubenswrapper[4681]: E0122 09:04:13.120624 4681 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-4rs5g\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"certified-operators-dockercfg-4rs5g\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.133765 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99lfs\" (UniqueName: \"kubernetes.io/projected/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-kube-api-access-99lfs\") pod \"community-operators-sdgpn\" (UID: \"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b\") " pod="openshift-marketplace/community-operators-sdgpn" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.133824 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-utilities\") pod \"community-operators-sdgpn\" (UID: \"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b\") " pod="openshift-marketplace/community-operators-sdgpn" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.133870 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-catalog-content\") pod \"community-operators-sdgpn\" (UID: \"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b\") " pod="openshift-marketplace/community-operators-sdgpn" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.133906 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:13 crc kubenswrapper[4681]: E0122 09:04:13.134340 4681 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:13.634329266 +0000 UTC m=+44.460239771 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.134906 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-utilities\") pod \"community-operators-sdgpn\" (UID: \"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b\") " pod="openshift-marketplace/community-operators-sdgpn" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.135060 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-catalog-content\") pod \"community-operators-sdgpn\" (UID: \"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b\") " pod="openshift-marketplace/community-operators-sdgpn" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.167735 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fmngz" event={"ID":"2d3823b7-bc6c-4206-9a4a-347488ed67ba","Type":"ContainerStarted","Data":"ba6cfb46d5231acd7aa4d6ac824d9e9a4cbf51b22b4f0258d7e7d7b8b8e3bae1"} Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.167807 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h8d59"] Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.193579 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-96pmp" event={"ID":"c0081e2b-bec4-458a-93cb-a6d580aa9558","Type":"ContainerStarted","Data":"2a910a7ec2d50ffff2c13e94cc99e3f44f0994b0cd01e2188db10c738a34405b"} Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.202440 4681 patch_prober.go:28] interesting pod/downloads-7954f5f757-fvrbp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.202508 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fvrbp" podUID="5c1adb5a-0fb0-4a16-a42e-0b79ec825963" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.210815 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.221406 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gdz6n" podStartSLOduration=21.221381236 podStartE2EDuration="21.221381236s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:13.167808201 +0000 UTC m=+43.993718706" watchObservedRunningTime="2026-01-22 09:04:13.221381236 +0000 UTC m=+44.047291741" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.221533 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-crvnv" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.222865 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m5r66" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.223355 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.237678 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qxrpc" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.238787 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.238876 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99lfs\" (UniqueName: \"kubernetes.io/projected/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-kube-api-access-99lfs\") pod \"community-operators-sdgpn\" (UID: \"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b\") " pod="openshift-marketplace/community-operators-sdgpn" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.239170 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-catalog-content\") pod \"certified-operators-h8d59\" (UID: \"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9\") " pod="openshift-marketplace/certified-operators-h8d59" Jan 22 09:04:13 crc kubenswrapper[4681]: E0122 09:04:13.239274 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:13.739235768 +0000 UTC m=+44.565146273 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.239485 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k9q4\" (UniqueName: \"kubernetes.io/projected/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-kube-api-access-9k9q4\") pod \"certified-operators-h8d59\" (UID: \"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9\") " pod="openshift-marketplace/certified-operators-h8d59" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.240071 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.240180 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-utilities\") pod \"certified-operators-h8d59\" (UID: \"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9\") " pod="openshift-marketplace/certified-operators-h8d59" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.245455 4681 patch_prober.go:28] interesting pod/router-default-5444994796-wcld5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 09:04:13 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Jan 22 09:04:13 crc kubenswrapper[4681]: [+]process-running ok Jan 22 09:04:13 crc kubenswrapper[4681]: healthz check failed Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.245534 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wcld5" podUID="47f80193-b6ae-4185-aa1f-320ba7f8dce9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 09:04:13 crc kubenswrapper[4681]: E0122 09:04:13.249626 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:13.749609722 +0000 UTC m=+44.575520227 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.261880 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-4dfqb" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.273323 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-r5cb5" podStartSLOduration=21.273302898 podStartE2EDuration="21.273302898s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:13.222691801 +0000 UTC m=+44.048602306" watchObservedRunningTime="2026-01-22 09:04:13.273302898 +0000 UTC m=+44.099213403" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.278135 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9d4hw" podStartSLOduration=9.278111485 podStartE2EDuration="9.278111485s" podCreationTimestamp="2026-01-22 09:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:13.272008453 +0000 UTC m=+44.097918958" watchObservedRunningTime="2026-01-22 09:04:13.278111485 +0000 UTC m=+44.104021980" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.331953 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sdgpn" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.343240 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-krnl8"] Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.355332 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-krnl8" Jan 22 09:04:13 crc kubenswrapper[4681]: E0122 09:04:13.343540 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:13.843510393 +0000 UTC m=+44.669420928 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.343396 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.363634 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.363771 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-utilities\") pod \"certified-operators-h8d59\" (UID: \"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9\") " pod="openshift-marketplace/certified-operators-h8d59" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.363952 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-catalog-content\") pod \"certified-operators-h8d59\" (UID: \"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9\") " pod="openshift-marketplace/certified-operators-h8d59" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.364069 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k9q4\" (UniqueName: \"kubernetes.io/projected/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-kube-api-access-9k9q4\") pod \"certified-operators-h8d59\" (UID: \"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9\") " pod="openshift-marketplace/certified-operators-h8d59" Jan 22 09:04:13 crc kubenswrapper[4681]: E0122 09:04:13.366094 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:13.866080509 +0000 UTC m=+44.691991014 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.366766 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-utilities\") pod \"certified-operators-h8d59\" (UID: \"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9\") " pod="openshift-marketplace/certified-operators-h8d59" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.369233 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-catalog-content\") pod \"certified-operators-h8d59\" (UID: \"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9\") " pod="openshift-marketplace/certified-operators-h8d59" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.408146 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hhrz2" podStartSLOduration=21.408118489 podStartE2EDuration="21.408118489s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:13.348154885 +0000 UTC m=+44.174065390" watchObservedRunningTime="2026-01-22 09:04:13.408118489 +0000 UTC m=+44.234028994" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.410971 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-krnl8"] Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.483541 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.483993 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6fbfd72-6801-4176-9847-653b6d0d9930-catalog-content\") pod \"community-operators-krnl8\" (UID: \"e6fbfd72-6801-4176-9847-653b6d0d9930\") " pod="openshift-marketplace/community-operators-krnl8" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.484053 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k9q4\" (UniqueName: \"kubernetes.io/projected/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-kube-api-access-9k9q4\") pod \"certified-operators-h8d59\" (UID: \"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9\") " pod="openshift-marketplace/certified-operators-h8d59" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.484097 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2gw4\" (UniqueName: \"kubernetes.io/projected/e6fbfd72-6801-4176-9847-653b6d0d9930-kube-api-access-k2gw4\") pod \"community-operators-krnl8\" (UID: 
\"e6fbfd72-6801-4176-9847-653b6d0d9930\") " pod="openshift-marketplace/community-operators-krnl8" Jan 22 09:04:13 crc kubenswrapper[4681]: E0122 09:04:13.484184 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:13.984164708 +0000 UTC m=+44.810075213 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.484317 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6fbfd72-6801-4176-9847-653b6d0d9930-utilities\") pod \"community-operators-krnl8\" (UID: \"e6fbfd72-6801-4176-9847-653b6d0d9930\") " pod="openshift-marketplace/community-operators-krnl8" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.484355 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:13 crc kubenswrapper[4681]: E0122 09:04:13.484708 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:13.984701823 +0000 UTC m=+44.810612328 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.570340 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q8bfs"] Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.572206 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q8bfs"] Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.572384 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q8bfs" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.595579 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.595834 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6fbfd72-6801-4176-9847-653b6d0d9930-utilities\") pod \"community-operators-krnl8\" (UID: \"e6fbfd72-6801-4176-9847-653b6d0d9930\") " pod="openshift-marketplace/community-operators-krnl8" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.595919 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6fbfd72-6801-4176-9847-653b6d0d9930-catalog-content\") pod \"community-operators-krnl8\" (UID: \"e6fbfd72-6801-4176-9847-653b6d0d9930\") " pod="openshift-marketplace/community-operators-krnl8" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.595971 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2gw4\" (UniqueName: \"kubernetes.io/projected/e6fbfd72-6801-4176-9847-653b6d0d9930-kube-api-access-k2gw4\") pod \"community-operators-krnl8\" (UID: \"e6fbfd72-6801-4176-9847-653b6d0d9930\") " pod="openshift-marketplace/community-operators-krnl8" Jan 22 09:04:13 crc kubenswrapper[4681]: E0122 09:04:13.596520 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:14.096499476 +0000 UTC m=+44.922409981 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.596918 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6fbfd72-6801-4176-9847-653b6d0d9930-utilities\") pod \"community-operators-krnl8\" (UID: \"e6fbfd72-6801-4176-9847-653b6d0d9930\") " pod="openshift-marketplace/community-operators-krnl8" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.597150 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6fbfd72-6801-4176-9847-653b6d0d9930-catalog-content\") pod \"community-operators-krnl8\" (UID: \"e6fbfd72-6801-4176-9847-653b6d0d9930\") " pod="openshift-marketplace/community-operators-krnl8" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.632080 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2gw4\" (UniqueName: \"kubernetes.io/projected/e6fbfd72-6801-4176-9847-653b6d0d9930-kube-api-access-k2gw4\") pod \"community-operators-krnl8\" (UID: \"e6fbfd72-6801-4176-9847-653b6d0d9930\") " pod="openshift-marketplace/community-operators-krnl8" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.696804 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41006e44-10b2-443f-b477-8fd39e7b643e-catalog-content\") pod \"certified-operators-q8bfs\" (UID: \"41006e44-10b2-443f-b477-8fd39e7b643e\") " pod="openshift-marketplace/certified-operators-q8bfs" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.697038 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.697125 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw59x\" (UniqueName: \"kubernetes.io/projected/41006e44-10b2-443f-b477-8fd39e7b643e-kube-api-access-tw59x\") pod \"certified-operators-q8bfs\" (UID: \"41006e44-10b2-443f-b477-8fd39e7b643e\") " pod="openshift-marketplace/certified-operators-q8bfs" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.697222 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41006e44-10b2-443f-b477-8fd39e7b643e-utilities\") pod \"certified-operators-q8bfs\" (UID: \"41006e44-10b2-443f-b477-8fd39e7b643e\") " pod="openshift-marketplace/certified-operators-q8bfs" Jan 22 09:04:13 crc kubenswrapper[4681]: E0122 09:04:13.697617 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-01-22 09:04:14.197603667 +0000 UTC m=+45.023514172 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.802429 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.802849 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw59x\" (UniqueName: \"kubernetes.io/projected/41006e44-10b2-443f-b477-8fd39e7b643e-kube-api-access-tw59x\") pod \"certified-operators-q8bfs\" (UID: \"41006e44-10b2-443f-b477-8fd39e7b643e\") " pod="openshift-marketplace/certified-operators-q8bfs" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.802890 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41006e44-10b2-443f-b477-8fd39e7b643e-utilities\") pod \"certified-operators-q8bfs\" (UID: \"41006e44-10b2-443f-b477-8fd39e7b643e\") " pod="openshift-marketplace/certified-operators-q8bfs" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.802979 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41006e44-10b2-443f-b477-8fd39e7b643e-catalog-content\") pod \"certified-operators-q8bfs\" (UID: \"41006e44-10b2-443f-b477-8fd39e7b643e\") " pod="openshift-marketplace/certified-operators-q8bfs" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.803747 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41006e44-10b2-443f-b477-8fd39e7b643e-catalog-content\") pod \"certified-operators-q8bfs\" (UID: \"41006e44-10b2-443f-b477-8fd39e7b643e\") " pod="openshift-marketplace/certified-operators-q8bfs" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.804053 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41006e44-10b2-443f-b477-8fd39e7b643e-utilities\") pod \"certified-operators-q8bfs\" (UID: \"41006e44-10b2-443f-b477-8fd39e7b643e\") " pod="openshift-marketplace/certified-operators-q8bfs" Jan 22 09:04:13 crc kubenswrapper[4681]: E0122 09:04:13.804083 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:14.30405396 +0000 UTC m=+45.129964605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.828971 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-krnl8" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.896229 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw59x\" (UniqueName: \"kubernetes.io/projected/41006e44-10b2-443f-b477-8fd39e7b643e-kube-api-access-tw59x\") pod \"certified-operators-q8bfs\" (UID: \"41006e44-10b2-443f-b477-8fd39e7b643e\") " pod="openshift-marketplace/certified-operators-q8bfs" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.905037 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:13 crc kubenswrapper[4681]: E0122 09:04:13.905385 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:14.405370446 +0000 UTC m=+45.231280951 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.938030 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fmngz" podStartSLOduration=21.938007259 podStartE2EDuration="21.938007259s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:13.911429596 +0000 UTC m=+44.737340101" watchObservedRunningTime="2026-01-22 09:04:13.938007259 +0000 UTC m=+44.763917764" Jan 22 09:04:13 crc kubenswrapper[4681]: I0122 09:04:13.987192 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sdgpn"] Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.006228 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:14 crc kubenswrapper[4681]: E0122 09:04:14.006746 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:14.506723494 +0000 UTC m=+45.332633999 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.108760 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:14 crc kubenswrapper[4681]: E0122 09:04:14.109326 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:14.609266843 +0000 UTC m=+45.435177348 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.135344 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.135467 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q8bfs" Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.137446 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h8d59" Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.209747 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:14 crc kubenswrapper[4681]: E0122 09:04:14.210172 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:14.710148548 +0000 UTC m=+45.536059053 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.225861 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vjf2g" event={"ID":"2e7e003a-24ec-4f48-a156-a5ed6a3afd03","Type":"ContainerStarted","Data":"df5eddfb5db736dee5f9c6f8f7e95b81b2a0591611140d34d0d45d8e254435e3"} Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.229369 4681 generic.go:334] "Generic (PLEG): container finished" podID="630a0503-a218-4ac5-b1db-01b76a08f5c1" containerID="950577f6efd0f52f9114d15046386d4fedee5e59db9af4df2dd946762bca948c" exitCode=0 Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.229428 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" event={"ID":"630a0503-a218-4ac5-b1db-01b76a08f5c1","Type":"ContainerDied","Data":"950577f6efd0f52f9114d15046386d4fedee5e59db9af4df2dd946762bca948c"} Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.240905 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6p99f" event={"ID":"4db34d31-e1cc-4afd-afb4-b0a5a535053d","Type":"ContainerStarted","Data":"281b56a1595caab7707dd385fcab56cb7e0951d0cbe18c97589045edb4451ddb"} Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.241678 4681 patch_prober.go:28] interesting pod/router-default-5444994796-wcld5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 09:04:14 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Jan 22 09:04:14 crc kubenswrapper[4681]: [+]process-running ok Jan 22 09:04:14 crc kubenswrapper[4681]: healthz check failed Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.241734 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wcld5" podUID="47f80193-b6ae-4185-aa1f-320ba7f8dce9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.243974 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" event={"ID":"363db0df-34ba-45e7-abce-c19cd7cc4d24","Type":"ContainerStarted","Data":"55f2717e4a8ae90e406b239d01b23deebce1c52e61d4bb594c4625e0c1d7bb0a"} Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.248342 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vjf2g" podStartSLOduration=23.248331037 podStartE2EDuration="23.248331037s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:14.246660393 +0000 UTC m=+45.072570898" watchObservedRunningTime="2026-01-22 09:04:14.248331037 +0000 UTC m=+45.074241542" Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.255181 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-sdgpn" event={"ID":"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b","Type":"ContainerStarted","Data":"3c79829b904c53898bc5fb946f2ad2cf413a3cb499b7f4a06b9173f9cbd13b8d"} Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.268472 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" podUID="5334d917-0f71-4e93-ae7a-2a169f3b7a34" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" gracePeriod=30 Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.284511 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-6p99f" podStartSLOduration=22.284481202 podStartE2EDuration="22.284481202s" podCreationTimestamp="2026-01-22 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:14.267784031 +0000 UTC m=+45.093694536" watchObservedRunningTime="2026-01-22 09:04:14.284481202 +0000 UTC m=+45.110391727" Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.310860 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:14 crc kubenswrapper[4681]: E0122 09:04:14.312871 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:14.812845582 +0000 UTC m=+45.638756087 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.315471 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-krnl8"] Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.317541 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" podStartSLOduration=23.317521085 podStartE2EDuration="23.317521085s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:14.310111709 +0000 UTC m=+45.136022214" watchObservedRunningTime="2026-01-22 09:04:14.317521085 +0000 UTC m=+45.143431590" Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.411513 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:14 crc kubenswrapper[4681]: E0122 09:04:14.411641 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:14.911614751 +0000 UTC m=+45.737525256 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.412058 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:14 crc kubenswrapper[4681]: E0122 09:04:14.416495 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:14.916470249 +0000 UTC m=+45.742380754 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.509062 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h8d59"] Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.513977 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:14 crc kubenswrapper[4681]: E0122 09:04:14.514523 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:15.014501739 +0000 UTC m=+45.840412244 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:14 crc kubenswrapper[4681]: W0122 09:04:14.515096 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e85041c_4d16_4a52_ae78_e3dc2d1e81f9.slice/crio-24b1f562282018edd88b319cbe1521587f1c76759f1108ec5b279208494ff181 WatchSource:0}: Error finding container 24b1f562282018edd88b319cbe1521587f1c76759f1108ec5b279208494ff181: Status 404 returned error can't find the container with id 24b1f562282018edd88b319cbe1521587f1c76759f1108ec5b279208494ff181 Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.584370 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q8bfs"] Jan 22 09:04:14 crc kubenswrapper[4681]: W0122 09:04:14.588949 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41006e44_10b2_443f_b477_8fd39e7b643e.slice/crio-ca71f854fbd73db44c60d595334cd0f8322b6f5871bfab7e709c43e1a564caea WatchSource:0}: Error finding container ca71f854fbd73db44c60d595334cd0f8322b6f5871bfab7e709c43e1a564caea: Status 404 returned error can't find the container with id ca71f854fbd73db44c60d595334cd0f8322b6f5871bfab7e709c43e1a564caea Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.615827 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:14 crc kubenswrapper[4681]: E0122 09:04:14.616142 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:15.116131954 +0000 UTC m=+45.942042459 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.718793 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:14 crc kubenswrapper[4681]: E0122 09:04:14.719130 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:15.219084024 +0000 UTC m=+46.044994529 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.719568 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:14 crc kubenswrapper[4681]: E0122 09:04:14.719953 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:15.219945837 +0000 UTC m=+46.045856342 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.820493 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:14 crc kubenswrapper[4681]: E0122 09:04:14.821502 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:15.321456519 +0000 UTC m=+46.147367024 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.865195 4681 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.875451 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-44cjz"] Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.876477 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44cjz" Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.880450 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.889472 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-44cjz"] Jan 22 09:04:14 crc kubenswrapper[4681]: I0122 09:04:14.923104 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:14 crc kubenswrapper[4681]: E0122 09:04:14.923481 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:15.423467144 +0000 UTC m=+46.249377649 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.023824 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.024069 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds2zh\" (UniqueName: \"kubernetes.io/projected/49ca3012-b9cb-46cd-b37c-4a74472c3fef-kube-api-access-ds2zh\") pod \"redhat-marketplace-44cjz\" (UID: \"49ca3012-b9cb-46cd-b37c-4a74472c3fef\") " pod="openshift-marketplace/redhat-marketplace-44cjz" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.024158 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49ca3012-b9cb-46cd-b37c-4a74472c3fef-catalog-content\") pod \"redhat-marketplace-44cjz\" (UID: \"49ca3012-b9cb-46cd-b37c-4a74472c3fef\") " pod="openshift-marketplace/redhat-marketplace-44cjz" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.024184 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49ca3012-b9cb-46cd-b37c-4a74472c3fef-utilities\") pod \"redhat-marketplace-44cjz\" (UID: \"49ca3012-b9cb-46cd-b37c-4a74472c3fef\") " pod="openshift-marketplace/redhat-marketplace-44cjz" Jan 22 09:04:15 crc kubenswrapper[4681]: E0122 09:04:15.024337 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:15.524320678 +0000 UTC m=+46.350231183 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.125005 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49ca3012-b9cb-46cd-b37c-4a74472c3fef-catalog-content\") pod \"redhat-marketplace-44cjz\" (UID: \"49ca3012-b9cb-46cd-b37c-4a74472c3fef\") " pod="openshift-marketplace/redhat-marketplace-44cjz" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.125053 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49ca3012-b9cb-46cd-b37c-4a74472c3fef-utilities\") pod \"redhat-marketplace-44cjz\" (UID: \"49ca3012-b9cb-46cd-b37c-4a74472c3fef\") " pod="openshift-marketplace/redhat-marketplace-44cjz" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.125141 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds2zh\" (UniqueName: \"kubernetes.io/projected/49ca3012-b9cb-46cd-b37c-4a74472c3fef-kube-api-access-ds2zh\") pod \"redhat-marketplace-44cjz\" (UID: \"49ca3012-b9cb-46cd-b37c-4a74472c3fef\") " pod="openshift-marketplace/redhat-marketplace-44cjz" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.125169 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:15 crc kubenswrapper[4681]: E0122 09:04:15.125556 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:15.625542263 +0000 UTC m=+46.451452768 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.126215 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49ca3012-b9cb-46cd-b37c-4a74472c3fef-catalog-content\") pod \"redhat-marketplace-44cjz\" (UID: \"49ca3012-b9cb-46cd-b37c-4a74472c3fef\") " pod="openshift-marketplace/redhat-marketplace-44cjz" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.126515 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49ca3012-b9cb-46cd-b37c-4a74472c3fef-utilities\") pod \"redhat-marketplace-44cjz\" (UID: \"49ca3012-b9cb-46cd-b37c-4a74472c3fef\") " pod="openshift-marketplace/redhat-marketplace-44cjz" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.149822 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds2zh\" (UniqueName: \"kubernetes.io/projected/49ca3012-b9cb-46cd-b37c-4a74472c3fef-kube-api-access-ds2zh\") pod \"redhat-marketplace-44cjz\" (UID: \"49ca3012-b9cb-46cd-b37c-4a74472c3fef\") " pod="openshift-marketplace/redhat-marketplace-44cjz" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.206793 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44cjz" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.226802 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:15 crc kubenswrapper[4681]: E0122 09:04:15.227248 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:15.727229539 +0000 UTC m=+46.553140044 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.238695 4681 patch_prober.go:28] interesting pod/router-default-5444994796-wcld5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 09:04:15 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Jan 22 09:04:15 crc kubenswrapper[4681]: [+]process-running ok Jan 22 09:04:15 crc kubenswrapper[4681]: healthz check failed Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.238782 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wcld5" podUID="47f80193-b6ae-4185-aa1f-320ba7f8dce9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.278872 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r4tkb"] Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.279848 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r4tkb" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.294229 4681 generic.go:334] "Generic (PLEG): container finished" podID="41006e44-10b2-443f-b477-8fd39e7b643e" containerID="ad0bd6b48fc4e99ab0567d9340e5374bdc68731c6e62e534a79cb7e4f3d5aedb" exitCode=0 Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.294356 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8bfs" event={"ID":"41006e44-10b2-443f-b477-8fd39e7b643e","Type":"ContainerDied","Data":"ad0bd6b48fc4e99ab0567d9340e5374bdc68731c6e62e534a79cb7e4f3d5aedb"} Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.294406 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8bfs" event={"ID":"41006e44-10b2-443f-b477-8fd39e7b643e","Type":"ContainerStarted","Data":"ca71f854fbd73db44c60d595334cd0f8322b6f5871bfab7e709c43e1a564caea"} Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.307956 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4tkb"] Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.316154 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.321744 4681 generic.go:334] "Generic (PLEG): container finished" podID="e6fbfd72-6801-4176-9847-653b6d0d9930" containerID="797a8489ac4ea3f366b131ea65add3d23402e0f88662eb6b7a2a9c5d925a4124" exitCode=0 Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.321978 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krnl8" event={"ID":"e6fbfd72-6801-4176-9847-653b6d0d9930","Type":"ContainerDied","Data":"797a8489ac4ea3f366b131ea65add3d23402e0f88662eb6b7a2a9c5d925a4124"} Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.322014 4681 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-krnl8" event={"ID":"e6fbfd72-6801-4176-9847-653b6d0d9930","Type":"ContainerStarted","Data":"bee6efed67d1b30698b166ce690bbad2b502c7d021dfd68f3dd4534248fed688"} Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.324428 4681 generic.go:334] "Generic (PLEG): container finished" podID="9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b" containerID="b4d6e8042e842996a16e1d6d91890befae5d24b50a5af4eb646e7772ea9047d0" exitCode=0 Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.324574 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdgpn" event={"ID":"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b","Type":"ContainerDied","Data":"b4d6e8042e842996a16e1d6d91890befae5d24b50a5af4eb646e7772ea9047d0"} Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.328114 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:15 crc kubenswrapper[4681]: E0122 09:04:15.328485 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:15.828473874 +0000 UTC m=+46.654384369 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.337115 4681 generic.go:334] "Generic (PLEG): container finished" podID="0e85041c-4d16-4a52-ae78-e3dc2d1e81f9" containerID="60be4201fccf16e6917af06144a5880c31de151e131366661a6b1d35c4c29b18" exitCode=0 Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.337214 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8d59" event={"ID":"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9","Type":"ContainerDied","Data":"60be4201fccf16e6917af06144a5880c31de151e131366661a6b1d35c4c29b18"} Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.337277 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8d59" event={"ID":"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9","Type":"ContainerStarted","Data":"24b1f562282018edd88b319cbe1521587f1c76759f1108ec5b279208494ff181"} Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.361223 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-96pmp" event={"ID":"c0081e2b-bec4-458a-93cb-a6d580aa9558","Type":"ContainerStarted","Data":"c717fc0fde9a795591e3bb692e00ee7b67cf6740f115a441495ed07fda7f2ad7"} Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.361314 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-96pmp" 
event={"ID":"c0081e2b-bec4-458a-93cb-a6d580aa9558","Type":"ContainerStarted","Data":"a93b0628c54e5484db8c0348ff1696545aa7057402f3995134f0d27d71dc7fbc"} Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.429384 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.429886 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f5j5\" (UniqueName: \"kubernetes.io/projected/a2cce978-fbc9-46f8-bd29-015898f4977b-kube-api-access-7f5j5\") pod \"redhat-marketplace-r4tkb\" (UID: \"a2cce978-fbc9-46f8-bd29-015898f4977b\") " pod="openshift-marketplace/redhat-marketplace-r4tkb" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.429992 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2cce978-fbc9-46f8-bd29-015898f4977b-utilities\") pod \"redhat-marketplace-r4tkb\" (UID: \"a2cce978-fbc9-46f8-bd29-015898f4977b\") " pod="openshift-marketplace/redhat-marketplace-r4tkb" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.430018 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2cce978-fbc9-46f8-bd29-015898f4977b-catalog-content\") pod \"redhat-marketplace-r4tkb\" (UID: \"a2cce978-fbc9-46f8-bd29-015898f4977b\") " pod="openshift-marketplace/redhat-marketplace-r4tkb" Jan 22 09:04:15 crc kubenswrapper[4681]: E0122 09:04:15.430290 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-22 09:04:15.930251823 +0000 UTC m=+46.756162328 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.532790 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.533316 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f5j5\" (UniqueName: \"kubernetes.io/projected/a2cce978-fbc9-46f8-bd29-015898f4977b-kube-api-access-7f5j5\") pod \"redhat-marketplace-r4tkb\" (UID: \"a2cce978-fbc9-46f8-bd29-015898f4977b\") " pod="openshift-marketplace/redhat-marketplace-r4tkb" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.533544 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2cce978-fbc9-46f8-bd29-015898f4977b-utilities\") pod \"redhat-marketplace-r4tkb\" (UID: \"a2cce978-fbc9-46f8-bd29-015898f4977b\") " pod="openshift-marketplace/redhat-marketplace-r4tkb" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.533617 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2cce978-fbc9-46f8-bd29-015898f4977b-catalog-content\") pod \"redhat-marketplace-r4tkb\" (UID: \"a2cce978-fbc9-46f8-bd29-015898f4977b\") " pod="openshift-marketplace/redhat-marketplace-r4tkb" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.535025 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2cce978-fbc9-46f8-bd29-015898f4977b-catalog-content\") pod \"redhat-marketplace-r4tkb\" (UID: \"a2cce978-fbc9-46f8-bd29-015898f4977b\") " pod="openshift-marketplace/redhat-marketplace-r4tkb" Jan 22 09:04:15 crc kubenswrapper[4681]: E0122 09:04:15.535204 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-22 09:04:16.035184475 +0000 UTC m=+46.861094970 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-c287z" (UID: "49778c45-8be5-4610-8298-01e06333289c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.537445 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2cce978-fbc9-46f8-bd29-015898f4977b-utilities\") pod \"redhat-marketplace-r4tkb\" (UID: \"a2cce978-fbc9-46f8-bd29-015898f4977b\") " pod="openshift-marketplace/redhat-marketplace-r4tkb" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.563015 4681 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-22T09:04:14.865231705Z","Handler":null,"Name":""} Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.566916 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f5j5\" (UniqueName: \"kubernetes.io/projected/a2cce978-fbc9-46f8-bd29-015898f4977b-kube-api-access-7f5j5\") pod \"redhat-marketplace-r4tkb\" (UID: \"a2cce978-fbc9-46f8-bd29-015898f4977b\") " pod="openshift-marketplace/redhat-marketplace-r4tkb" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.598239 4681 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.598330 4681 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.600648 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r4tkb" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.636973 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.661745 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-44cjz"] Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.664647 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.738719 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.775218 4681 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.775296 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.835170 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.877503 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 22 09:04:15 crc kubenswrapper[4681]: E0122 09:04:15.877800 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630a0503-a218-4ac5-b1db-01b76a08f5c1" containerName="collect-profiles" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.877816 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="630a0503-a218-4ac5-b1db-01b76a08f5c1" containerName="collect-profiles" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.877990 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="630a0503-a218-4ac5-b1db-01b76a08f5c1" containerName="collect-profiles" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.878487 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.881743 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.882119 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.912889 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.942419 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/630a0503-a218-4ac5-b1db-01b76a08f5c1-secret-volume\") pod \"630a0503-a218-4ac5-b1db-01b76a08f5c1\" (UID: \"630a0503-a218-4ac5-b1db-01b76a08f5c1\") " Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.948011 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/630a0503-a218-4ac5-b1db-01b76a08f5c1-config-volume\") pod \"630a0503-a218-4ac5-b1db-01b76a08f5c1\" (UID: \"630a0503-a218-4ac5-b1db-01b76a08f5c1\") " Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.948107 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbqlc\" (UniqueName: \"kubernetes.io/projected/630a0503-a218-4ac5-b1db-01b76a08f5c1-kube-api-access-fbqlc\") pod \"630a0503-a218-4ac5-b1db-01b76a08f5c1\" (UID: \"630a0503-a218-4ac5-b1db-01b76a08f5c1\") " Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.949960 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/630a0503-a218-4ac5-b1db-01b76a08f5c1-config-volume" (OuterVolumeSpecName: "config-volume") pod "630a0503-a218-4ac5-b1db-01b76a08f5c1" (UID: "630a0503-a218-4ac5-b1db-01b76a08f5c1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.961920 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/630a0503-a218-4ac5-b1db-01b76a08f5c1-kube-api-access-fbqlc" (OuterVolumeSpecName: "kube-api-access-fbqlc") pod "630a0503-a218-4ac5-b1db-01b76a08f5c1" (UID: "630a0503-a218-4ac5-b1db-01b76a08f5c1"). InnerVolumeSpecName "kube-api-access-fbqlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.964076 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/630a0503-a218-4ac5-b1db-01b76a08f5c1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "630a0503-a218-4ac5-b1db-01b76a08f5c1" (UID: "630a0503-a218-4ac5-b1db-01b76a08f5c1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:04:15 crc kubenswrapper[4681]: I0122 09:04:15.968979 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-c287z\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.008794 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4tkb"] Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.049663 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.049761 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.049827 4681 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/630a0503-a218-4ac5-b1db-01b76a08f5c1-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.049841 4681 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/630a0503-a218-4ac5-b1db-01b76a08f5c1-config-volume\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.049853 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbqlc\" (UniqueName: \"kubernetes.io/projected/630a0503-a218-4ac5-b1db-01b76a08f5c1-kube-api-access-fbqlc\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.152839 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.152980 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.153036 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.166403 
4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.166777 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.171550 4681 patch_prober.go:28] interesting pod/console-f9d7485db-glh9f container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.171609 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-glh9f" podUID="c26ba6fb-8b7a-4207-82ed-3b746c50e824" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.183597 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.208893 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.236093 4681 patch_prober.go:28] interesting pod/router-default-5444994796-wcld5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 09:04:16 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Jan 22 09:04:16 crc kubenswrapper[4681]: [+]process-running ok Jan 22 09:04:16 crc kubenswrapper[4681]: healthz check failed Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.236144 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wcld5" podUID="47f80193-b6ae-4185-aa1f-320ba7f8dce9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.238148 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.278776 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7pz5k"] Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.283682 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7pz5k" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.288816 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7pz5k"] Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.289528 4681 patch_prober.go:28] interesting pod/downloads-7954f5f757-fvrbp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.289571 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fvrbp" podUID="5c1adb5a-0fb0-4a16-a42e-0b79ec825963" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.289705 4681 patch_prober.go:28] interesting pod/downloads-7954f5f757-fvrbp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.289796 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-fvrbp" podUID="5c1adb5a-0fb0-4a16-a42e-0b79ec825963" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.291433 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.356643 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cafe2ee6-7f62-4d78-8e7e-de58c8506696-catalog-content\") pod \"redhat-operators-7pz5k\" (UID: \"cafe2ee6-7f62-4d78-8e7e-de58c8506696\") " pod="openshift-marketplace/redhat-operators-7pz5k" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.356738 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4dj8\" (UniqueName: \"kubernetes.io/projected/cafe2ee6-7f62-4d78-8e7e-de58c8506696-kube-api-access-g4dj8\") pod \"redhat-operators-7pz5k\" (UID: \"cafe2ee6-7f62-4d78-8e7e-de58c8506696\") " pod="openshift-marketplace/redhat-operators-7pz5k" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.356789 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cafe2ee6-7f62-4d78-8e7e-de58c8506696-utilities\") pod \"redhat-operators-7pz5k\" (UID: \"cafe2ee6-7f62-4d78-8e7e-de58c8506696\") " pod="openshift-marketplace/redhat-operators-7pz5k" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.394345 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" event={"ID":"630a0503-a218-4ac5-b1db-01b76a08f5c1","Type":"ContainerDied","Data":"ebabdeea40776306f7fab0142bb0db77b5d01e97d9362cc2966131af275df1bb"} Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.394403 4681 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ebabdeea40776306f7fab0142bb0db77b5d01e97d9362cc2966131af275df1bb" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.394516 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.407072 4681 generic.go:334] "Generic (PLEG): container finished" podID="a2cce978-fbc9-46f8-bd29-015898f4977b" containerID="48205f1210faf9932d845e7de7db8345f271e6c644654cfd465e37e13ae11829" exitCode=0 Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.407162 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4tkb" event={"ID":"a2cce978-fbc9-46f8-bd29-015898f4977b","Type":"ContainerDied","Data":"48205f1210faf9932d845e7de7db8345f271e6c644654cfd465e37e13ae11829"} Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.407199 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4tkb" event={"ID":"a2cce978-fbc9-46f8-bd29-015898f4977b","Type":"ContainerStarted","Data":"67c4ded127efe95c536c2570806e352cd01a3f2f98e48e6ad59eeeeb3faf8a3d"} Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.424241 4681 generic.go:334] "Generic (PLEG): container finished" podID="49ca3012-b9cb-46cd-b37c-4a74472c3fef" containerID="92da1bf799b307618104f2c868856efa6120d74df5cbe2b9abb792056a6cc91f" exitCode=0 Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.424355 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44cjz" event={"ID":"49ca3012-b9cb-46cd-b37c-4a74472c3fef","Type":"ContainerDied","Data":"92da1bf799b307618104f2c868856efa6120d74df5cbe2b9abb792056a6cc91f"} Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.424393 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44cjz" event={"ID":"49ca3012-b9cb-46cd-b37c-4a74472c3fef","Type":"ContainerStarted","Data":"024e8074716c7c0bc480ab3b7b2b99d8b6f47cc6869a5a5a84b83b14231e6cb1"} Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.434555 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-96pmp" event={"ID":"c0081e2b-bec4-458a-93cb-a6d580aa9558","Type":"ContainerStarted","Data":"5bdf1e8155e730ac0305068926f267690186d9bf11247088c58e3c0d6430f446"} Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.461740 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cafe2ee6-7f62-4d78-8e7e-de58c8506696-catalog-content\") pod \"redhat-operators-7pz5k\" (UID: \"cafe2ee6-7f62-4d78-8e7e-de58c8506696\") " pod="openshift-marketplace/redhat-operators-7pz5k" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.461803 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4dj8\" (UniqueName: \"kubernetes.io/projected/cafe2ee6-7f62-4d78-8e7e-de58c8506696-kube-api-access-g4dj8\") pod \"redhat-operators-7pz5k\" (UID: \"cafe2ee6-7f62-4d78-8e7e-de58c8506696\") " pod="openshift-marketplace/redhat-operators-7pz5k" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.461841 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cafe2ee6-7f62-4d78-8e7e-de58c8506696-utilities\") pod \"redhat-operators-7pz5k\" (UID: \"cafe2ee6-7f62-4d78-8e7e-de58c8506696\") " 
pod="openshift-marketplace/redhat-operators-7pz5k" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.462422 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cafe2ee6-7f62-4d78-8e7e-de58c8506696-utilities\") pod \"redhat-operators-7pz5k\" (UID: \"cafe2ee6-7f62-4d78-8e7e-de58c8506696\") " pod="openshift-marketplace/redhat-operators-7pz5k" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.462744 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cafe2ee6-7f62-4d78-8e7e-de58c8506696-catalog-content\") pod \"redhat-operators-7pz5k\" (UID: \"cafe2ee6-7f62-4d78-8e7e-de58c8506696\") " pod="openshift-marketplace/redhat-operators-7pz5k" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.499458 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4dj8\" (UniqueName: \"kubernetes.io/projected/cafe2ee6-7f62-4d78-8e7e-de58c8506696-kube-api-access-g4dj8\") pod \"redhat-operators-7pz5k\" (UID: \"cafe2ee6-7f62-4d78-8e7e-de58c8506696\") " pod="openshift-marketplace/redhat-operators-7pz5k" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.510776 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-96pmp" podStartSLOduration=12.510758299999999 podStartE2EDuration="12.5107583s" podCreationTimestamp="2026-01-22 09:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:16.503107717 +0000 UTC m=+47.329018222" watchObservedRunningTime="2026-01-22 09:04:16.5107583 +0000 UTC m=+47.336668805" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.556098 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.623999 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7pz5k" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.682364 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5dskr"] Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.683611 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5dskr" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.690697 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.690749 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.705963 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5dskr"] Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.712755 4681 patch_prober.go:28] interesting pod/apiserver-76f77b778f-c2p8w container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 22 09:04:16 crc kubenswrapper[4681]: [+]log ok Jan 22 09:04:16 crc kubenswrapper[4681]: [+]etcd ok Jan 22 09:04:16 crc kubenswrapper[4681]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 22 09:04:16 crc kubenswrapper[4681]: [+]poststarthook/generic-apiserver-start-informers ok Jan 22 09:04:16 crc kubenswrapper[4681]: [+]poststarthook/max-in-flight-filter ok Jan 22 09:04:16 crc kubenswrapper[4681]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 22 09:04:16 crc kubenswrapper[4681]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 22 09:04:16 crc kubenswrapper[4681]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 22 09:04:16 crc kubenswrapper[4681]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Jan 22 09:04:16 crc kubenswrapper[4681]: [+]poststarthook/project.openshift.io-projectcache ok Jan 22 09:04:16 crc kubenswrapper[4681]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 22 09:04:16 crc kubenswrapper[4681]: [+]poststarthook/openshift.io-startinformers ok Jan 22 09:04:16 crc kubenswrapper[4681]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 22 09:04:16 crc kubenswrapper[4681]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 22 09:04:16 crc kubenswrapper[4681]: livez check failed Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.712835 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" podUID="363db0df-34ba-45e7-abce-c19cd7cc4d24" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.779821 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kz4f\" (UniqueName: \"kubernetes.io/projected/78f21f15-5d84-4792-b7dd-2ae823beb0b0-kube-api-access-8kz4f\") pod \"redhat-operators-5dskr\" (UID: \"78f21f15-5d84-4792-b7dd-2ae823beb0b0\") " pod="openshift-marketplace/redhat-operators-5dskr" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.780127 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78f21f15-5d84-4792-b7dd-2ae823beb0b0-utilities\") pod \"redhat-operators-5dskr\" (UID: \"78f21f15-5d84-4792-b7dd-2ae823beb0b0\") " pod="openshift-marketplace/redhat-operators-5dskr" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.780190 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78f21f15-5d84-4792-b7dd-2ae823beb0b0-catalog-content\") pod \"redhat-operators-5dskr\" (UID: \"78f21f15-5d84-4792-b7dd-2ae823beb0b0\") " pod="openshift-marketplace/redhat-operators-5dskr" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.881613 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78f21f15-5d84-4792-b7dd-2ae823beb0b0-utilities\") pod \"redhat-operators-5dskr\" (UID: \"78f21f15-5d84-4792-b7dd-2ae823beb0b0\") " pod="openshift-marketplace/redhat-operators-5dskr" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.882103 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78f21f15-5d84-4792-b7dd-2ae823beb0b0-catalog-content\") pod \"redhat-operators-5dskr\" (UID: \"78f21f15-5d84-4792-b7dd-2ae823beb0b0\") " pod="openshift-marketplace/redhat-operators-5dskr" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.882218 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kz4f\" (UniqueName: \"kubernetes.io/projected/78f21f15-5d84-4792-b7dd-2ae823beb0b0-kube-api-access-8kz4f\") pod \"redhat-operators-5dskr\" (UID: \"78f21f15-5d84-4792-b7dd-2ae823beb0b0\") " pod="openshift-marketplace/redhat-operators-5dskr" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.883755 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78f21f15-5d84-4792-b7dd-2ae823beb0b0-catalog-content\") pod \"redhat-operators-5dskr\" (UID: \"78f21f15-5d84-4792-b7dd-2ae823beb0b0\") " pod="openshift-marketplace/redhat-operators-5dskr" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.884107 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78f21f15-5d84-4792-b7dd-2ae823beb0b0-utilities\") pod \"redhat-operators-5dskr\" (UID: \"78f21f15-5d84-4792-b7dd-2ae823beb0b0\") " pod="openshift-marketplace/redhat-operators-5dskr" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.907622 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kz4f\" (UniqueName: \"kubernetes.io/projected/78f21f15-5d84-4792-b7dd-2ae823beb0b0-kube-api-access-8kz4f\") pod \"redhat-operators-5dskr\" (UID: \"78f21f15-5d84-4792-b7dd-2ae823beb0b0\") " pod="openshift-marketplace/redhat-operators-5dskr" Jan 22 09:04:16 crc kubenswrapper[4681]: I0122 09:04:16.917104 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-c287z"] Jan 22 09:04:16 crc kubenswrapper[4681]: W0122 09:04:16.961200 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49778c45_8be5_4610_8298_01e06333289c.slice/crio-ad285833ebead20ecc1fec6dea10f3d83342ccda240bd998f750dabcaa6f42c7 WatchSource:0}: Error finding container ad285833ebead20ecc1fec6dea10f3d83342ccda240bd998f750dabcaa6f42c7: Status 404 returned error can't find the container with id ad285833ebead20ecc1fec6dea10f3d83342ccda240bd998f750dabcaa6f42c7 Jan 22 09:04:17 crc kubenswrapper[4681]: I0122 09:04:17.042221 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7pz5k"] Jan 22 09:04:17 crc 
kubenswrapper[4681]: I0122 09:04:17.045232 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5dskr" Jan 22 09:04:17 crc kubenswrapper[4681]: W0122 09:04:17.071202 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcafe2ee6_7f62_4d78_8e7e_de58c8506696.slice/crio-675b381a73b1a3b03a92dcaf8fc504071068e5d2cc78036b77fc7a06da7043e6 WatchSource:0}: Error finding container 675b381a73b1a3b03a92dcaf8fc504071068e5d2cc78036b77fc7a06da7043e6: Status 404 returned error can't find the container with id 675b381a73b1a3b03a92dcaf8fc504071068e5d2cc78036b77fc7a06da7043e6 Jan 22 09:04:17 crc kubenswrapper[4681]: I0122 09:04:17.232475 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:17 crc kubenswrapper[4681]: I0122 09:04:17.237451 4681 patch_prober.go:28] interesting pod/router-default-5444994796-wcld5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 09:04:17 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Jan 22 09:04:17 crc kubenswrapper[4681]: [+]process-running ok Jan 22 09:04:17 crc kubenswrapper[4681]: healthz check failed Jan 22 09:04:17 crc kubenswrapper[4681]: I0122 09:04:17.237509 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wcld5" podUID="47f80193-b6ae-4185-aa1f-320ba7f8dce9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 09:04:17 crc kubenswrapper[4681]: I0122 09:04:17.269339 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5dskr"] Jan 22 09:04:17 crc kubenswrapper[4681]: E0122 09:04:17.400152 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 09:04:17 crc kubenswrapper[4681]: E0122 09:04:17.429415 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 09:04:17 crc kubenswrapper[4681]: E0122 09:04:17.447190 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 09:04:17 crc kubenswrapper[4681]: E0122 09:04:17.447376 4681 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" podUID="5334d917-0f71-4e93-ae7a-2a169f3b7a34" containerName="kube-multus-additional-cni-plugins" Jan 22 09:04:17 crc kubenswrapper[4681]: I0122 09:04:17.486405 4681 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 22 09:04:17 crc kubenswrapper[4681]: I0122 09:04:17.492591 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dskr" event={"ID":"78f21f15-5d84-4792-b7dd-2ae823beb0b0","Type":"ContainerStarted","Data":"0c678d250d47d3f62e9355e80cfea2ccebb1af2653f20870c82b5c6f84c2881e"} Jan 22 09:04:17 crc kubenswrapper[4681]: I0122 09:04:17.501790 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4","Type":"ContainerStarted","Data":"0d995606887dd0ef5545110a5a1e62c6627f2c55b7bfaef1f0466bbec3d504dd"} Jan 22 09:04:17 crc kubenswrapper[4681]: I0122 09:04:17.501856 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4","Type":"ContainerStarted","Data":"28b147c2147585b08191081de1d7f65f384d91f5b224b4ca374dfd00efe65912"} Jan 22 09:04:17 crc kubenswrapper[4681]: I0122 09:04:17.506828 4681 generic.go:334] "Generic (PLEG): container finished" podID="cafe2ee6-7f62-4d78-8e7e-de58c8506696" containerID="fb5f705aa809ba1e83dbbeaa46b90ceb50b0b856ab810ee9a5e6bbf337a9e2f2" exitCode=0 Jan 22 09:04:17 crc kubenswrapper[4681]: I0122 09:04:17.506915 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pz5k" event={"ID":"cafe2ee6-7f62-4d78-8e7e-de58c8506696","Type":"ContainerDied","Data":"fb5f705aa809ba1e83dbbeaa46b90ceb50b0b856ab810ee9a5e6bbf337a9e2f2"} Jan 22 09:04:17 crc kubenswrapper[4681]: I0122 09:04:17.506986 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pz5k" event={"ID":"cafe2ee6-7f62-4d78-8e7e-de58c8506696","Type":"ContainerStarted","Data":"675b381a73b1a3b03a92dcaf8fc504071068e5d2cc78036b77fc7a06da7043e6"} Jan 22 09:04:17 crc kubenswrapper[4681]: I0122 09:04:17.518048 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.51803196 podStartE2EDuration="2.51803196s" podCreationTimestamp="2026-01-22 09:04:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:17.517133667 +0000 UTC m=+48.343044172" watchObservedRunningTime="2026-01-22 09:04:17.51803196 +0000 UTC m=+48.343942465" Jan 22 09:04:17 crc kubenswrapper[4681]: I0122 09:04:17.529233 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-c287z" event={"ID":"49778c45-8be5-4610-8298-01e06333289c","Type":"ContainerStarted","Data":"b791dfc6198d96c5df5454c0919e251d4300ad70d6223522af5e088182c3ceb0"} Jan 22 09:04:17 crc kubenswrapper[4681]: I0122 09:04:17.529296 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-c287z" event={"ID":"49778c45-8be5-4610-8298-01e06333289c","Type":"ContainerStarted","Data":"ad285833ebead20ecc1fec6dea10f3d83342ccda240bd998f750dabcaa6f42c7"} Jan 22 09:04:17 crc kubenswrapper[4681]: I0122 09:04:17.529546 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:17 crc kubenswrapper[4681]: I0122 
09:04:17.573191 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-c287z" podStartSLOduration=26.573165897 podStartE2EDuration="26.573165897s" podCreationTimestamp="2026-01-22 09:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:17.573087515 +0000 UTC m=+48.398998020" watchObservedRunningTime="2026-01-22 09:04:17.573165897 +0000 UTC m=+48.399076402" Jan 22 09:04:18 crc kubenswrapper[4681]: I0122 09:04:18.235524 4681 patch_prober.go:28] interesting pod/router-default-5444994796-wcld5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 09:04:18 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Jan 22 09:04:18 crc kubenswrapper[4681]: [+]process-running ok Jan 22 09:04:18 crc kubenswrapper[4681]: healthz check failed Jan 22 09:04:18 crc kubenswrapper[4681]: I0122 09:04:18.235952 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wcld5" podUID="47f80193-b6ae-4185-aa1f-320ba7f8dce9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 09:04:18 crc kubenswrapper[4681]: I0122 09:04:18.539361 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4","Type":"ContainerDied","Data":"0d995606887dd0ef5545110a5a1e62c6627f2c55b7bfaef1f0466bbec3d504dd"} Jan 22 09:04:18 crc kubenswrapper[4681]: I0122 09:04:18.539300 4681 generic.go:334] "Generic (PLEG): container finished" podID="8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4" containerID="0d995606887dd0ef5545110a5a1e62c6627f2c55b7bfaef1f0466bbec3d504dd" exitCode=0 Jan 22 09:04:18 crc kubenswrapper[4681]: I0122 09:04:18.544179 4681 generic.go:334] "Generic (PLEG): container finished" podID="78f21f15-5d84-4792-b7dd-2ae823beb0b0" containerID="10133ec666c98534bf787f2a612f74b75a7e3d49b6de68349e6907ac608160a9" exitCode=0 Jan 22 09:04:18 crc kubenswrapper[4681]: I0122 09:04:18.544242 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dskr" event={"ID":"78f21f15-5d84-4792-b7dd-2ae823beb0b0","Type":"ContainerDied","Data":"10133ec666c98534bf787f2a612f74b75a7e3d49b6de68349e6907ac608160a9"} Jan 22 09:04:19 crc kubenswrapper[4681]: I0122 09:04:19.236415 4681 patch_prober.go:28] interesting pod/router-default-5444994796-wcld5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 09:04:19 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Jan 22 09:04:19 crc kubenswrapper[4681]: [+]process-running ok Jan 22 09:04:19 crc kubenswrapper[4681]: healthz check failed Jan 22 09:04:19 crc kubenswrapper[4681]: I0122 09:04:19.236496 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wcld5" podUID="47f80193-b6ae-4185-aa1f-320ba7f8dce9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 09:04:19 crc kubenswrapper[4681]: I0122 09:04:19.551907 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
Jan 22 09:04:19 crc kubenswrapper[4681]: I0122 09:04:19.566448 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 22 09:04:19 crc kubenswrapper[4681]: I0122 09:04:19.934706 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.034821 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4-kube-api-access\") pod \"8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4\" (UID: \"8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4\") " Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.034893 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4-kubelet-dir\") pod \"8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4\" (UID: \"8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4\") " Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.035418 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4" (UID: "8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.057874 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4" (UID: "8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.141468 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.141517 4681 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.243699 4681 patch_prober.go:28] interesting pod/router-default-5444994796-wcld5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 09:04:20 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Jan 22 09:04:20 crc kubenswrapper[4681]: [+]process-running ok Jan 22 09:04:20 crc kubenswrapper[4681]: healthz check failed Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.243774 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wcld5" podUID="47f80193-b6ae-4185-aa1f-320ba7f8dce9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.343594 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.343689 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.343718 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.343754 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.348353 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.348384 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.360394 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.366296 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.569310 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4","Type":"ContainerDied","Data":"28b147c2147585b08191081de1d7f65f384d91f5b224b4ca374dfd00efe65912"} Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.569380 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28b147c2147585b08191081de1d7f65f384d91f5b224b4ca374dfd00efe65912" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.569356 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.619478 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.627446 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:04:20 crc kubenswrapper[4681]: I0122 09:04:20.638944 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 22 09:04:21 crc kubenswrapper[4681]: I0122 09:04:21.064839 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=2.064818873 podStartE2EDuration="2.064818873s" podCreationTimestamp="2026-01-22 09:04:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:19.983336653 +0000 UTC m=+50.809247158" watchObservedRunningTime="2026-01-22 09:04:21.064818873 +0000 UTC m=+51.890729378" Jan 22 09:04:21 crc kubenswrapper[4681]: W0122 09:04:21.098117 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-ae344b4db66a070577801bc10c88eba38789d8716316a2af6ef0a9e402274d33 WatchSource:0}: Error finding container ae344b4db66a070577801bc10c88eba38789d8716316a2af6ef0a9e402274d33: Status 404 returned error can't find the container with id ae344b4db66a070577801bc10c88eba38789d8716316a2af6ef0a9e402274d33 Jan 22 09:04:21 crc kubenswrapper[4681]: I0122 09:04:21.238513 4681 patch_prober.go:28] interesting pod/router-default-5444994796-wcld5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 09:04:21 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Jan 22 09:04:21 crc kubenswrapper[4681]: [+]process-running ok Jan 22 09:04:21 crc kubenswrapper[4681]: healthz check failed Jan 22 09:04:21 crc kubenswrapper[4681]: I0122 09:04:21.238601 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wcld5" podUID="47f80193-b6ae-4185-aa1f-320ba7f8dce9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 09:04:21 crc kubenswrapper[4681]: W0122 09:04:21.295062 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-9458fb2d075a95c4bcc103fe317671430e1c5f4c662595eef8ecce11e9ba8f97 WatchSource:0}: Error finding container 9458fb2d075a95c4bcc103fe317671430e1c5f4c662595eef8ecce11e9ba8f97: Status 404 returned error can't find the container with id 9458fb2d075a95c4bcc103fe317671430e1c5f4c662595eef8ecce11e9ba8f97 Jan 22 09:04:21 crc kubenswrapper[4681]: I0122 09:04:21.582563 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ae344b4db66a070577801bc10c88eba38789d8716316a2af6ef0a9e402274d33"} Jan 22 09:04:21 crc kubenswrapper[4681]: I0122 09:04:21.590411 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"036e299c5ae79ce8a345f77bd5a6ec54ad2e28a6830599e27be3a79d03f92e78"} Jan 22 09:04:21 crc kubenswrapper[4681]: I0122 09:04:21.592415 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9458fb2d075a95c4bcc103fe317671430e1c5f4c662595eef8ecce11e9ba8f97"} Jan 22 09:04:21 crc kubenswrapper[4681]: I0122 09:04:21.698485 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:21 crc kubenswrapper[4681]: I0122 09:04:21.705197 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-c2p8w" Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.236091 4681 patch_prober.go:28] interesting pod/router-default-5444994796-wcld5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 09:04:22 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Jan 22 09:04:22 crc kubenswrapper[4681]: [+]process-running ok Jan 22 09:04:22 crc kubenswrapper[4681]: healthz check failed Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.236571 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wcld5" podUID="47f80193-b6ae-4185-aa1f-320ba7f8dce9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.247213 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 22 09:04:22 crc kubenswrapper[4681]: E0122 09:04:22.247596 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4" containerName="pruner" Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.247672 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4" containerName="pruner" Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.247840 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e6df956-53aa-4c7e-a3cf-ec3c0b99f5f4" containerName="pruner" Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.248321 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.255941 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.258006 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.260515 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.400591 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42895545-1050-480e-86cb-9591ab3d4e07-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"42895545-1050-480e-86cb-9591ab3d4e07\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.400658 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42895545-1050-480e-86cb-9591ab3d4e07-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"42895545-1050-480e-86cb-9591ab3d4e07\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.502039 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42895545-1050-480e-86cb-9591ab3d4e07-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"42895545-1050-480e-86cb-9591ab3d4e07\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.502119 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42895545-1050-480e-86cb-9591ab3d4e07-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"42895545-1050-480e-86cb-9591ab3d4e07\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.502196 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42895545-1050-480e-86cb-9591ab3d4e07-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"42895545-1050-480e-86cb-9591ab3d4e07\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.521968 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42895545-1050-480e-86cb-9591ab3d4e07-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"42895545-1050-480e-86cb-9591ab3d4e07\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.603712 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0f626865813499b1ba4fdf93fc6549a5ddc383db132138808ad82d4705fdbde9"} Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.603853 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 
09:04:22.607694 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ef2f71082e1cc4c799f0efbd9c4d745356296907ee967961e997c2ff9ad03449"} Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.611796 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f2818ef89ef86f3b0f38bcb4247c6acc04555f89012b7a2b9c28d076b2820b08"} Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.616342 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 09:04:22 crc kubenswrapper[4681]: I0122 09:04:22.715152 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9d4hw" Jan 22 09:04:23 crc kubenswrapper[4681]: I0122 09:04:23.235485 4681 patch_prober.go:28] interesting pod/router-default-5444994796-wcld5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 09:04:23 crc kubenswrapper[4681]: [-]has-synced failed: reason withheld Jan 22 09:04:23 crc kubenswrapper[4681]: [+]process-running ok Jan 22 09:04:23 crc kubenswrapper[4681]: healthz check failed Jan 22 09:04:23 crc kubenswrapper[4681]: I0122 09:04:23.235899 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wcld5" podUID="47f80193-b6ae-4185-aa1f-320ba7f8dce9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 09:04:24 crc kubenswrapper[4681]: I0122 09:04:24.241509 4681 patch_prober.go:28] interesting pod/router-default-5444994796-wcld5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 22 09:04:24 crc kubenswrapper[4681]: [+]has-synced ok Jan 22 09:04:24 crc kubenswrapper[4681]: [+]process-running ok Jan 22 09:04:24 crc kubenswrapper[4681]: healthz check failed Jan 22 09:04:24 crc kubenswrapper[4681]: I0122 09:04:24.241606 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wcld5" podUID="47f80193-b6ae-4185-aa1f-320ba7f8dce9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 22 09:04:25 crc kubenswrapper[4681]: I0122 09:04:25.239891 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:25 crc kubenswrapper[4681]: I0122 09:04:25.246565 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-wcld5" Jan 22 09:04:26 crc kubenswrapper[4681]: I0122 09:04:26.167726 4681 patch_prober.go:28] interesting pod/console-f9d7485db-glh9f container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 22 09:04:26 crc kubenswrapper[4681]: I0122 09:04:26.167819 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-glh9f" 
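
The router and console records above are produced by kubelet startup probes: the kubelet retries the HTTP check on a short period and only marks the pod started once the endpoint stops returning 500 (router, succeeds at 09:04:25) or refusing connections (console). As a minimal illustrative sketch only, and not the actual openshift-ingress or console manifests, a startup probe that would generate records like these can be declared with the Kubernetes Go API types as below; the path, port, scheme, period and failure threshold are placeholder assumptions, and the snippet assumes the k8s.io/api and k8s.io/apimachinery modules are available.

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	// Hypothetical HTTP startup probe; path/port/period/threshold are
	// illustrative assumptions, not values read from the real router Deployment.
	startup := &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Path:   "/healthz",           // endpoint the kubelet probes
				Port:   intstr.FromInt(1936), // assumed stats/health port
				Scheme: corev1.URISchemeHTTP,
			},
		},
		PeriodSeconds:    1,  // roughly matches the ~1 s spacing of the failures above
		FailureThreshold: 60, // restart only after this many consecutive failures
	}
	fmt.Printf("startup probe: %+v\n", startup)
}

Until the probe succeeds, the kubelet keeps logging "Probe failed" lines like the ones above; once it succeeds it emits the "SyncLoop (probe) startup started" record and readiness probing takes over.
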
podUID="c26ba6fb-8b7a-4207-82ed-3b746c50e824" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 22 09:04:26 crc kubenswrapper[4681]: I0122 09:04:26.305052 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-fvrbp" Jan 22 09:04:27 crc kubenswrapper[4681]: E0122 09:04:27.385052 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 09:04:27 crc kubenswrapper[4681]: E0122 09:04:27.388332 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 09:04:27 crc kubenswrapper[4681]: E0122 09:04:27.389970 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 09:04:27 crc kubenswrapper[4681]: E0122 09:04:27.390010 4681 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" podUID="5334d917-0f71-4e93-ae7a-2a169f3b7a34" containerName="kube-multus-additional-cni-plugins" Jan 22 09:04:33 crc kubenswrapper[4681]: I0122 09:04:33.039730 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6rt8z"] Jan 22 09:04:33 crc kubenswrapper[4681]: I0122 09:04:33.040634 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" podUID="47db6098-5a83-4d02-bec9-886b3dd01a4f" containerName="controller-manager" containerID="cri-o://5c21c99f8daecc6e2be50252b098f47c71d9e950113ee60a74636005c6191572" gracePeriod=30 Jan 22 09:04:33 crc kubenswrapper[4681]: I0122 09:04:33.119944 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb"] Jan 22 09:04:33 crc kubenswrapper[4681]: I0122 09:04:33.120557 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" podUID="fd2a235c-01fb-4c8c-97d6-cf399c39ab1c" containerName="route-controller-manager" containerID="cri-o://67523b472eff08d46515826d753e0bd9bdfc0c06ed0d3285cd22b291dedcdf2f" gracePeriod=30 Jan 22 09:04:33 crc kubenswrapper[4681]: I0122 09:04:33.702129 4681 generic.go:334] "Generic (PLEG): container finished" podID="fd2a235c-01fb-4c8c-97d6-cf399c39ab1c" containerID="67523b472eff08d46515826d753e0bd9bdfc0c06ed0d3285cd22b291dedcdf2f" exitCode=0 Jan 22 09:04:33 crc kubenswrapper[4681]: I0122 09:04:33.702227 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" event={"ID":"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c","Type":"ContainerDied","Data":"67523b472eff08d46515826d753e0bd9bdfc0c06ed0d3285cd22b291dedcdf2f"} Jan 22 09:04:33 crc kubenswrapper[4681]: I0122 09:04:33.705140 4681 generic.go:334] "Generic (PLEG): container finished" podID="47db6098-5a83-4d02-bec9-886b3dd01a4f" containerID="5c21c99f8daecc6e2be50252b098f47c71d9e950113ee60a74636005c6191572" exitCode=0 Jan 22 09:04:33 crc kubenswrapper[4681]: I0122 09:04:33.705169 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" event={"ID":"47db6098-5a83-4d02-bec9-886b3dd01a4f","Type":"ContainerDied","Data":"5c21c99f8daecc6e2be50252b098f47c71d9e950113ee60a74636005c6191572"} Jan 22 09:04:36 crc kubenswrapper[4681]: I0122 09:04:36.171530 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:36 crc kubenswrapper[4681]: I0122 09:04:36.177400 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-glh9f" Jan 22 09:04:36 crc kubenswrapper[4681]: I0122 09:04:36.253337 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:04:36 crc kubenswrapper[4681]: I0122 09:04:36.343675 4681 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6rt8z container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Jan 22 09:04:36 crc kubenswrapper[4681]: I0122 09:04:36.343743 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" podUID="47db6098-5a83-4d02-bec9-886b3dd01a4f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Jan 22 09:04:36 crc kubenswrapper[4681]: I0122 09:04:36.718211 4681 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ckctb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Jan 22 09:04:36 crc kubenswrapper[4681]: I0122 09:04:36.718722 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" podUID="fd2a235c-01fb-4c8c-97d6-cf399c39ab1c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Jan 22 09:04:37 crc kubenswrapper[4681]: E0122 09:04:37.384509 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 09:04:37 crc kubenswrapper[4681]: E0122 09:04:37.387204 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" containerID="74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 09:04:37 crc kubenswrapper[4681]: E0122 09:04:37.390563 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 09:04:37 crc kubenswrapper[4681]: E0122 09:04:37.390619 4681 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" podUID="5334d917-0f71-4e93-ae7a-2a169f3b7a34" containerName="kube-multus-additional-cni-plugins" Jan 22 09:04:43 crc kubenswrapper[4681]: E0122 09:04:43.695851 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 22 09:04:43 crc kubenswrapper[4681]: E0122 09:04:43.696725 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9k9q4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-h8d59_openshift-marketplace(0e85041c-4d16-4a52-ae78-e3dc2d1e81f9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 09:04:43 crc kubenswrapper[4681]: E0122 09:04:43.697986 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-h8d59" 
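
The "Unhandled Error" record above embeds the failing extract-content init container as a serialized Go struct, which is hard to read inline. Purely as a readability aid, the same container can be written out with the k8s.io/api types as below: the name, image, command, args, volume mounts and security context values are copied from the log entry above, while the pointer helpers and the surrounding program are scaffolding added here and the snippet assumes the k8s.io/api module is available.

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

// small local helpers so the snippet needs no extra dependencies
func boolPtr(b bool) *bool    { return &b }
func int64Ptr(i int64) *int64 { return &i }

func main() {
	// The extract-content init container from the certified-operators-h8d59
	// "Unhandled Error" entry above, re-rendered as Go API types.
	extractContent := corev1.Container{
		Name:    "extract-content",
		Image:   "registry.redhat.io/redhat/certified-operator-index:v4.18",
		Command: []string{"/utilities/copy-content"},
		Args: []string{
			"--catalog.from=/configs",
			"--catalog.to=/extracted-catalog/catalog",
			"--cache.from=/tmp/cache",
			"--cache.to=/extracted-catalog/cache",
		},
		VolumeMounts: []corev1.VolumeMount{
			{Name: "utilities", MountPath: "/utilities"},
			{Name: "catalog-content", MountPath: "/extracted-catalog"},
			{Name: "kube-api-access-9k9q4", ReadOnly: true,
				MountPath: "/var/run/secrets/kubernetes.io/serviceaccount"},
		},
		TerminationMessagePath:   "/dev/termination-log",
		TerminationMessagePolicy: corev1.TerminationMessageFallbackToLogsOnError,
		ImagePullPolicy:          corev1.PullAlways,
		SecurityContext: &corev1.SecurityContext{
			Capabilities:             &corev1.Capabilities{Drop: []corev1.Capability{"ALL"}},
			RunAsUser:                int64Ptr(1000170000),
			RunAsNonRoot:             boolPtr(true),
			AllowPrivilegeEscalation: boolPtr(false),
		},
	}
	fmt.Printf("%+v\n", extractContent)
}
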
podUID="0e85041c-4d16-4a52-ae78-e3dc2d1e81f9" Jan 22 09:04:44 crc kubenswrapper[4681]: I0122 09:04:44.800192 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-4xk8b_5334d917-0f71-4e93-ae7a-2a169f3b7a34/kube-multus-additional-cni-plugins/0.log" Jan 22 09:04:44 crc kubenswrapper[4681]: I0122 09:04:44.800934 4681 generic.go:334] "Generic (PLEG): container finished" podID="5334d917-0f71-4e93-ae7a-2a169f3b7a34" containerID="74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" exitCode=137 Jan 22 09:04:44 crc kubenswrapper[4681]: I0122 09:04:44.800990 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" event={"ID":"5334d917-0f71-4e93-ae7a-2a169f3b7a34","Type":"ContainerDied","Data":"74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65"} Jan 22 09:04:45 crc kubenswrapper[4681]: I0122 09:04:45.472554 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 22 09:04:45 crc kubenswrapper[4681]: E0122 09:04:45.830667 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 22 09:04:45 crc kubenswrapper[4681]: E0122 09:04:45.830979 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tw59x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-q8bfs_openshift-marketplace(41006e44-10b2-443f-b477-8fd39e7b643e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 09:04:45 crc kubenswrapper[4681]: E0122 09:04:45.833529 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: 
copying config: context canceled\"" pod="openshift-marketplace/certified-operators-q8bfs" podUID="41006e44-10b2-443f-b477-8fd39e7b643e" Jan 22 09:04:46 crc kubenswrapper[4681]: I0122 09:04:46.718628 4681 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ckctb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Jan 22 09:04:46 crc kubenswrapper[4681]: I0122 09:04:46.718761 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" podUID="fd2a235c-01fb-4c8c-97d6-cf399c39ab1c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Jan 22 09:04:46 crc kubenswrapper[4681]: I0122 09:04:46.861052 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.860988544 podStartE2EDuration="1.860988544s" podCreationTimestamp="2026-01-22 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:04:46.858639774 +0000 UTC m=+77.684550319" watchObservedRunningTime="2026-01-22 09:04:46.860988544 +0000 UTC m=+77.686899079" Jan 22 09:04:47 crc kubenswrapper[4681]: I0122 09:04:47.020466 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9kg44" Jan 22 09:04:47 crc kubenswrapper[4681]: I0122 09:04:47.342191 4681 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6rt8z container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 22 09:04:47 crc kubenswrapper[4681]: I0122 09:04:47.342785 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" podUID="47db6098-5a83-4d02-bec9-886b3dd01a4f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 22 09:04:47 crc kubenswrapper[4681]: E0122 09:04:47.383206 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65 is running failed: container process not found" containerID="74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 09:04:47 crc kubenswrapper[4681]: E0122 09:04:47.383802 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65 is running failed: container process not found" containerID="74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 09:04:47 crc kubenswrapper[4681]: E0122 
09:04:47.384338 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65 is running failed: container process not found" containerID="74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 09:04:47 crc kubenswrapper[4681]: E0122 09:04:47.384431 4681 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" podUID="5334d917-0f71-4e93-ae7a-2a169f3b7a34" containerName="kube-multus-additional-cni-plugins" Jan 22 09:04:51 crc kubenswrapper[4681]: E0122 09:04:51.674182 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-q8bfs" podUID="41006e44-10b2-443f-b477-8fd39e7b643e" Jan 22 09:04:51 crc kubenswrapper[4681]: E0122 09:04:51.674795 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-h8d59" podUID="0e85041c-4d16-4a52-ae78-e3dc2d1e81f9" Jan 22 09:04:52 crc kubenswrapper[4681]: E0122 09:04:52.323054 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 22 09:04:52 crc kubenswrapper[4681]: E0122 09:04:52.323231 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k2gw4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-krnl8_openshift-marketplace(e6fbfd72-6801-4176-9847-653b6d0d9930): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 09:04:52 crc kubenswrapper[4681]: E0122 09:04:52.325199 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-krnl8" podUID="e6fbfd72-6801-4176-9847-653b6d0d9930" Jan 22 09:04:53 crc kubenswrapper[4681]: E0122 09:04:53.745532 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 22 09:04:53 crc kubenswrapper[4681]: E0122 09:04:53.746105 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-99lfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-sdgpn_openshift-marketplace(9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 09:04:53 crc kubenswrapper[4681]: E0122 09:04:53.747545 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-sdgpn" podUID="9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b" Jan 22 09:04:56 crc kubenswrapper[4681]: I0122 09:04:56.252844 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 22 09:04:56 crc kubenswrapper[4681]: I0122 09:04:56.255183 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 09:04:56 crc kubenswrapper[4681]: I0122 09:04:56.271116 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 22 09:04:56 crc kubenswrapper[4681]: I0122 09:04:56.389474 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/95b4c223-31d2-4bf0-b59e-581984a72a0b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"95b4c223-31d2-4bf0-b59e-581984a72a0b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 09:04:56 crc kubenswrapper[4681]: I0122 09:04:56.389941 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95b4c223-31d2-4bf0-b59e-581984a72a0b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"95b4c223-31d2-4bf0-b59e-581984a72a0b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 09:04:56 crc kubenswrapper[4681]: I0122 09:04:56.491591 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95b4c223-31d2-4bf0-b59e-581984a72a0b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"95b4c223-31d2-4bf0-b59e-581984a72a0b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 09:04:56 crc kubenswrapper[4681]: I0122 09:04:56.491673 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/95b4c223-31d2-4bf0-b59e-581984a72a0b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"95b4c223-31d2-4bf0-b59e-581984a72a0b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 09:04:56 crc kubenswrapper[4681]: I0122 09:04:56.491854 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/95b4c223-31d2-4bf0-b59e-581984a72a0b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"95b4c223-31d2-4bf0-b59e-581984a72a0b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 09:04:56 crc kubenswrapper[4681]: I0122 09:04:56.524042 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95b4c223-31d2-4bf0-b59e-581984a72a0b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"95b4c223-31d2-4bf0-b59e-581984a72a0b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 09:04:56 crc kubenswrapper[4681]: I0122 09:04:56.589425 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 09:04:56 crc kubenswrapper[4681]: E0122 09:04:56.752114 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-sdgpn" podUID="9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b" Jan 22 09:04:56 crc kubenswrapper[4681]: E0122 09:04:56.752089 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-krnl8" podUID="e6fbfd72-6801-4176-9847-653b6d0d9930" Jan 22 09:04:57 crc kubenswrapper[4681]: I0122 09:04:57.341872 4681 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6rt8z container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 22 09:04:57 crc kubenswrapper[4681]: I0122 09:04:57.341966 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" podUID="47db6098-5a83-4d02-bec9-886b3dd01a4f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 22 09:04:57 crc kubenswrapper[4681]: E0122 09:04:57.381918 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65 is running failed: container process not found" containerID="74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 09:04:57 crc kubenswrapper[4681]: E0122 09:04:57.382801 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65 is running failed: container process not found" containerID="74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 09:04:57 crc kubenswrapper[4681]: E0122 09:04:57.383405 4681 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65 is running failed: container process not found" containerID="74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 22 09:04:57 crc kubenswrapper[4681]: E0122 09:04:57.383490 4681 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" podUID="5334d917-0f71-4e93-ae7a-2a169f3b7a34" 
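
The repeated "ExecSync cmd from runtime service failed" records for the cni-sysctl-allowlist pod come from an exec readiness probe that runs test -f /ready/ready inside the container; once the container is stopping, and later gone entirely (exit code 137 = 128 + SIGKILL in the PLEG record above), the exec can no longer be registered and the probe errors instead of simply failing. As a sketch only: the command below is taken from the log, the ~10 second period is inferred from the timestamps (09:04:27, :37, :47, :57) rather than read from the DaemonSet, and the snippet assumes the k8s.io/api module is available.

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// Exec readiness probe matching the command seen in the ExecSync records
	// above; PeriodSeconds is an inference from their ~10 s spacing.
	readiness := &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			Exec: &corev1.ExecAction{
				Command: []string{"/bin/bash", "-c", "test -f /ready/ready"},
			},
		},
		PeriodSeconds: 10,
	}
	fmt.Printf("readiness probe: %+v\n", readiness)
}
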
containerName="kube-multus-additional-cni-plugins" Jan 22 09:04:57 crc kubenswrapper[4681]: I0122 09:04:57.717927 4681 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ckctb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 22 09:04:57 crc kubenswrapper[4681]: I0122 09:04:57.718040 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" podUID="fd2a235c-01fb-4c8c-97d6-cf399c39ab1c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 22 09:04:58 crc kubenswrapper[4681]: E0122 09:04:58.213880 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 22 09:04:58 crc kubenswrapper[4681]: E0122 09:04:58.214322 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g4dj8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7pz5k_openshift-marketplace(cafe2ee6-7f62-4d78-8e7e-de58c8506696): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 09:04:58 crc kubenswrapper[4681]: E0122 09:04:58.215656 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-7pz5k" 
podUID="cafe2ee6-7f62-4d78-8e7e-de58c8506696" Jan 22 09:04:58 crc kubenswrapper[4681]: E0122 09:04:58.238312 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 22 09:04:58 crc kubenswrapper[4681]: E0122 09:04:58.238517 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8kz4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-5dskr_openshift-marketplace(78f21f15-5d84-4792-b7dd-2ae823beb0b0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 09:04:58 crc kubenswrapper[4681]: E0122 09:04:58.239802 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5dskr" podUID="78f21f15-5d84-4792-b7dd-2ae823beb0b0" Jan 22 09:04:59 crc kubenswrapper[4681]: E0122 09:04:59.412818 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5dskr" podUID="78f21f15-5d84-4792-b7dd-2ae823beb0b0" Jan 22 09:04:59 crc kubenswrapper[4681]: E0122 09:04:59.413246 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7pz5k" podUID="cafe2ee6-7f62-4d78-8e7e-de58c8506696" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.539697 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.540715 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.545195 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-4xk8b_5334d917-0f71-4e93-ae7a-2a169f3b7a34/kube-multus-additional-cni-plugins/0.log" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.545235 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:59 crc kubenswrapper[4681]: E0122 09:04:59.547221 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 22 09:04:59 crc kubenswrapper[4681]: E0122 09:04:59.547356 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7f5j5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-r4tkb_openshift-marketplace(a2cce978-fbc9-46f8-bd29-015898f4977b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 09:04:59 crc kubenswrapper[4681]: E0122 09:04:59.549108 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-r4tkb" podUID="a2cce978-fbc9-46f8-bd29-015898f4977b" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.587559 4681 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg"] Jan 22 09:04:59 crc kubenswrapper[4681]: E0122 09:04:59.588082 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5334d917-0f71-4e93-ae7a-2a169f3b7a34" containerName="kube-multus-additional-cni-plugins" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.588094 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="5334d917-0f71-4e93-ae7a-2a169f3b7a34" containerName="kube-multus-additional-cni-plugins" Jan 22 09:04:59 crc kubenswrapper[4681]: E0122 09:04:59.588109 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd2a235c-01fb-4c8c-97d6-cf399c39ab1c" containerName="route-controller-manager" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.588116 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd2a235c-01fb-4c8c-97d6-cf399c39ab1c" containerName="route-controller-manager" Jan 22 09:04:59 crc kubenswrapper[4681]: E0122 09:04:59.588130 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47db6098-5a83-4d02-bec9-886b3dd01a4f" containerName="controller-manager" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.588136 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="47db6098-5a83-4d02-bec9-886b3dd01a4f" containerName="controller-manager" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.588298 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="47db6098-5a83-4d02-bec9-886b3dd01a4f" containerName="controller-manager" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.588312 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd2a235c-01fb-4c8c-97d6-cf399c39ab1c" containerName="route-controller-manager" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.588326 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="5334d917-0f71-4e93-ae7a-2a169f3b7a34" containerName="kube-multus-additional-cni-plugins" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.588755 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:04:59 crc kubenswrapper[4681]: E0122 09:04:59.592117 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 22 09:04:59 crc kubenswrapper[4681]: E0122 09:04:59.592277 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ds2zh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-44cjz_openshift-marketplace(49ca3012-b9cb-46cd-b37c-4a74472c3fef): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 09:04:59 crc kubenswrapper[4681]: E0122 09:04:59.593825 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-44cjz" podUID="49ca3012-b9cb-46cd-b37c-4a74472c3fef" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.612017 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg"] Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.643681 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-serving-cert\") pod \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\" (UID: \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\") " Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.643726 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-client-ca\") pod 
\"47db6098-5a83-4d02-bec9-886b3dd01a4f\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.643752 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-proxy-ca-bundles\") pod \"47db6098-5a83-4d02-bec9-886b3dd01a4f\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.643819 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47db6098-5a83-4d02-bec9-886b3dd01a4f-serving-cert\") pod \"47db6098-5a83-4d02-bec9-886b3dd01a4f\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.643848 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-client-ca\") pod \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\" (UID: \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\") " Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.643892 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-config\") pod \"47db6098-5a83-4d02-bec9-886b3dd01a4f\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.643927 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-config\") pod \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\" (UID: \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\") " Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.643973 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kl8k\" (UniqueName: \"kubernetes.io/projected/47db6098-5a83-4d02-bec9-886b3dd01a4f-kube-api-access-5kl8k\") pod \"47db6098-5a83-4d02-bec9-886b3dd01a4f\" (UID: \"47db6098-5a83-4d02-bec9-886b3dd01a4f\") " Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.644011 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q674l\" (UniqueName: \"kubernetes.io/projected/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-kube-api-access-q674l\") pod \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\" (UID: \"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c\") " Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.645540 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-client-ca" (OuterVolumeSpecName: "client-ca") pod "fd2a235c-01fb-4c8c-97d6-cf399c39ab1c" (UID: "fd2a235c-01fb-4c8c-97d6-cf399c39ab1c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.646525 4681 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.646740 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-config" (OuterVolumeSpecName: "config") pod "fd2a235c-01fb-4c8c-97d6-cf399c39ab1c" (UID: "fd2a235c-01fb-4c8c-97d6-cf399c39ab1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.647294 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-client-ca" (OuterVolumeSpecName: "client-ca") pod "47db6098-5a83-4d02-bec9-886b3dd01a4f" (UID: "47db6098-5a83-4d02-bec9-886b3dd01a4f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.647350 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "47db6098-5a83-4d02-bec9-886b3dd01a4f" (UID: "47db6098-5a83-4d02-bec9-886b3dd01a4f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.647671 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-config" (OuterVolumeSpecName: "config") pod "47db6098-5a83-4d02-bec9-886b3dd01a4f" (UID: "47db6098-5a83-4d02-bec9-886b3dd01a4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.653994 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-kube-api-access-q674l" (OuterVolumeSpecName: "kube-api-access-q674l") pod "fd2a235c-01fb-4c8c-97d6-cf399c39ab1c" (UID: "fd2a235c-01fb-4c8c-97d6-cf399c39ab1c"). InnerVolumeSpecName "kube-api-access-q674l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.653968 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47db6098-5a83-4d02-bec9-886b3dd01a4f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "47db6098-5a83-4d02-bec9-886b3dd01a4f" (UID: "47db6098-5a83-4d02-bec9-886b3dd01a4f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.654938 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fd2a235c-01fb-4c8c-97d6-cf399c39ab1c" (UID: "fd2a235c-01fb-4c8c-97d6-cf399c39ab1c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.655216 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47db6098-5a83-4d02-bec9-886b3dd01a4f-kube-api-access-5kl8k" (OuterVolumeSpecName: "kube-api-access-5kl8k") pod "47db6098-5a83-4d02-bec9-886b3dd01a4f" (UID: "47db6098-5a83-4d02-bec9-886b3dd01a4f"). InnerVolumeSpecName "kube-api-access-5kl8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.738430 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747210 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5334d917-0f71-4e93-ae7a-2a169f3b7a34-cni-sysctl-allowlist\") pod \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\" (UID: \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\") " Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747279 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/5334d917-0f71-4e93-ae7a-2a169f3b7a34-ready\") pod \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\" (UID: \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\") " Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747300 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5334d917-0f71-4e93-ae7a-2a169f3b7a34-tuning-conf-dir\") pod \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\" (UID: \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\") " Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747348 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfvn6\" (UniqueName: \"kubernetes.io/projected/5334d917-0f71-4e93-ae7a-2a169f3b7a34-kube-api-access-jfvn6\") pod \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\" (UID: \"5334d917-0f71-4e93-ae7a-2a169f3b7a34\") " Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747480 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f75f1bac-c894-483b-a366-854399619cec-serving-cert\") pod \"route-controller-manager-84ddb9ccd5-sllhg\" (UID: \"f75f1bac-c894-483b-a366-854399619cec\") " pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747471 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5334d917-0f71-4e93-ae7a-2a169f3b7a34-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "5334d917-0f71-4e93-ae7a-2a169f3b7a34" (UID: "5334d917-0f71-4e93-ae7a-2a169f3b7a34"). InnerVolumeSpecName "tuning-conf-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747511 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxzn4\" (UniqueName: \"kubernetes.io/projected/f75f1bac-c894-483b-a366-854399619cec-kube-api-access-qxzn4\") pod \"route-controller-manager-84ddb9ccd5-sllhg\" (UID: \"f75f1bac-c894-483b-a366-854399619cec\") " pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747534 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f75f1bac-c894-483b-a366-854399619cec-config\") pod \"route-controller-manager-84ddb9ccd5-sllhg\" (UID: \"f75f1bac-c894-483b-a366-854399619cec\") " pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747682 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f75f1bac-c894-483b-a366-854399619cec-client-ca\") pod \"route-controller-manager-84ddb9ccd5-sllhg\" (UID: \"f75f1bac-c894-483b-a366-854399619cec\") " pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747857 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747874 4681 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5334d917-0f71-4e93-ae7a-2a169f3b7a34-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747887 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747897 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kl8k\" (UniqueName: \"kubernetes.io/projected/47db6098-5a83-4d02-bec9-886b3dd01a4f-kube-api-access-5kl8k\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747908 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q674l\" (UniqueName: \"kubernetes.io/projected/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-kube-api-access-q674l\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747917 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747911 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5334d917-0f71-4e93-ae7a-2a169f3b7a34-ready" (OuterVolumeSpecName: "ready") pod "5334d917-0f71-4e93-ae7a-2a169f3b7a34" (UID: "5334d917-0f71-4e93-ae7a-2a169f3b7a34"). InnerVolumeSpecName "ready". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747927 4681 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747985 4681 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47db6098-5a83-4d02-bec9-886b3dd01a4f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.747997 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47db6098-5a83-4d02-bec9-886b3dd01a4f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.748040 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5334d917-0f71-4e93-ae7a-2a169f3b7a34-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "5334d917-0f71-4e93-ae7a-2a169f3b7a34" (UID: "5334d917-0f71-4e93-ae7a-2a169f3b7a34"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:04:59 crc kubenswrapper[4681]: W0122 09:04:59.749538 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod95b4c223_31d2_4bf0_b59e_581984a72a0b.slice/crio-8f2e0c7e509067b51f00d07a25816b32d2e223d2baa444f43b49284de4c6d57d WatchSource:0}: Error finding container 8f2e0c7e509067b51f00d07a25816b32d2e223d2baa444f43b49284de4c6d57d: Status 404 returned error can't find the container with id 8f2e0c7e509067b51f00d07a25816b32d2e223d2baa444f43b49284de4c6d57d Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.750824 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5334d917-0f71-4e93-ae7a-2a169f3b7a34-kube-api-access-jfvn6" (OuterVolumeSpecName: "kube-api-access-jfvn6") pod "5334d917-0f71-4e93-ae7a-2a169f3b7a34" (UID: "5334d917-0f71-4e93-ae7a-2a169f3b7a34"). InnerVolumeSpecName "kube-api-access-jfvn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.849007 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxzn4\" (UniqueName: \"kubernetes.io/projected/f75f1bac-c894-483b-a366-854399619cec-kube-api-access-qxzn4\") pod \"route-controller-manager-84ddb9ccd5-sllhg\" (UID: \"f75f1bac-c894-483b-a366-854399619cec\") " pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.849381 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f75f1bac-c894-483b-a366-854399619cec-config\") pod \"route-controller-manager-84ddb9ccd5-sllhg\" (UID: \"f75f1bac-c894-483b-a366-854399619cec\") " pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.849430 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f75f1bac-c894-483b-a366-854399619cec-client-ca\") pod \"route-controller-manager-84ddb9ccd5-sllhg\" (UID: \"f75f1bac-c894-483b-a366-854399619cec\") " pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.849486 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f75f1bac-c894-483b-a366-854399619cec-serving-cert\") pod \"route-controller-manager-84ddb9ccd5-sllhg\" (UID: \"f75f1bac-c894-483b-a366-854399619cec\") " pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.849529 4681 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5334d917-0f71-4e93-ae7a-2a169f3b7a34-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.849542 4681 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/5334d917-0f71-4e93-ae7a-2a169f3b7a34-ready\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.849552 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfvn6\" (UniqueName: \"kubernetes.io/projected/5334d917-0f71-4e93-ae7a-2a169f3b7a34-kube-api-access-jfvn6\") on node \"crc\" DevicePath \"\"" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.852974 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f75f1bac-c894-483b-a366-854399619cec-client-ca\") pod \"route-controller-manager-84ddb9ccd5-sllhg\" (UID: \"f75f1bac-c894-483b-a366-854399619cec\") " pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.853779 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f75f1bac-c894-483b-a366-854399619cec-config\") pod \"route-controller-manager-84ddb9ccd5-sllhg\" (UID: \"f75f1bac-c894-483b-a366-854399619cec\") " pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.857161 4681 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f75f1bac-c894-483b-a366-854399619cec-serving-cert\") pod \"route-controller-manager-84ddb9ccd5-sllhg\" (UID: \"f75f1bac-c894-483b-a366-854399619cec\") " pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.875541 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxzn4\" (UniqueName: \"kubernetes.io/projected/f75f1bac-c894-483b-a366-854399619cec-kube-api-access-qxzn4\") pod \"route-controller-manager-84ddb9ccd5-sllhg\" (UID: \"f75f1bac-c894-483b-a366-854399619cec\") " pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.885199 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.910871 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.916861 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.916850 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6rt8z" event={"ID":"47db6098-5a83-4d02-bec9-886b3dd01a4f","Type":"ContainerDied","Data":"365d7940ad3c4440c18fdd6b9d4ba1a366a2339172eee119696f753f94179587"} Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.916973 4681 scope.go:117] "RemoveContainer" containerID="5c21c99f8daecc6e2be50252b098f47c71d9e950113ee60a74636005c6191572" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.918636 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"42895545-1050-480e-86cb-9591ab3d4e07","Type":"ContainerStarted","Data":"9a298fa0357a427815d869ef65f3b194bdb1b2f8c0e947ccaa6d234f59d583f8"} Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.920403 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"95b4c223-31d2-4bf0-b59e-581984a72a0b","Type":"ContainerStarted","Data":"8f2e0c7e509067b51f00d07a25816b32d2e223d2baa444f43b49284de4c6d57d"} Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.921954 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-4xk8b_5334d917-0f71-4e93-ae7a-2a169f3b7a34/kube-multus-additional-cni-plugins/0.log" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.921996 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" event={"ID":"5334d917-0f71-4e93-ae7a-2a169f3b7a34","Type":"ContainerDied","Data":"c67140e54c0963dfb66ef7efcb88eafb2940484fe07d8026423e8cf1b2f77536"} Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.922050 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-4xk8b" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.925328 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" event={"ID":"fd2a235c-01fb-4c8c-97d6-cf399c39ab1c","Type":"ContainerDied","Data":"fb37ac4c32870ec559225c9092ddd268cc2ce9608a5796940ad2c08d787c7e1e"} Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.925376 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb" Jan 22 09:04:59 crc kubenswrapper[4681]: E0122 09:04:59.927424 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-44cjz" podUID="49ca3012-b9cb-46cd-b37c-4a74472c3fef" Jan 22 09:04:59 crc kubenswrapper[4681]: E0122 09:04:59.927608 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-r4tkb" podUID="a2cce978-fbc9-46f8-bd29-015898f4977b" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.951241 4681 scope.go:117] "RemoveContainer" containerID="74a6846856268f9ba15276c66f32d8fc552c15a07e8e4f08c6a28dfa611bdc65" Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.980398 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6rt8z"] Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.982215 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6rt8z"] Jan 22 09:04:59 crc kubenswrapper[4681]: I0122 09:04:59.989166 4681 scope.go:117] "RemoveContainer" containerID="67523b472eff08d46515826d753e0bd9bdfc0c06ed0d3285cd22b291dedcdf2f" Jan 22 09:05:00 crc kubenswrapper[4681]: I0122 09:05:00.006690 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-4xk8b"] Jan 22 09:05:00 crc kubenswrapper[4681]: I0122 09:05:00.009853 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-4xk8b"] Jan 22 09:05:00 crc kubenswrapper[4681]: I0122 09:05:00.017826 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb"] Jan 22 09:05:00 crc kubenswrapper[4681]: I0122 09:05:00.023066 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ckctb"] Jan 22 09:05:00 crc kubenswrapper[4681]: I0122 09:05:00.404039 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg"] Jan 22 09:05:00 crc kubenswrapper[4681]: I0122 09:05:00.635883 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 22 09:05:00 crc kubenswrapper[4681]: I0122 09:05:00.942391 4681 generic.go:334] "Generic (PLEG): container finished" podID="95b4c223-31d2-4bf0-b59e-581984a72a0b" 
containerID="709ae5deb65f54d50b3e9c9245dff19f342caff976e042e170ced42bb8d34b6f" exitCode=0 Jan 22 09:05:00 crc kubenswrapper[4681]: I0122 09:05:00.942458 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"95b4c223-31d2-4bf0-b59e-581984a72a0b","Type":"ContainerDied","Data":"709ae5deb65f54d50b3e9c9245dff19f342caff976e042e170ced42bb8d34b6f"} Jan 22 09:05:00 crc kubenswrapper[4681]: I0122 09:05:00.947184 4681 generic.go:334] "Generic (PLEG): container finished" podID="42895545-1050-480e-86cb-9591ab3d4e07" containerID="e27ae8b5b2bb641724e94ea0548d94c6c52de57ee56c4120a4f36b5dad8cc779" exitCode=0 Jan 22 09:05:00 crc kubenswrapper[4681]: I0122 09:05:00.947223 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"42895545-1050-480e-86cb-9591ab3d4e07","Type":"ContainerDied","Data":"e27ae8b5b2bb641724e94ea0548d94c6c52de57ee56c4120a4f36b5dad8cc779"} Jan 22 09:05:00 crc kubenswrapper[4681]: I0122 09:05:00.948709 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" event={"ID":"f75f1bac-c894-483b-a366-854399619cec","Type":"ContainerStarted","Data":"9952b5cd362e51bccf562c2d69349fb7b86a94907475bd04fc0ef7b26507e058"} Jan 22 09:05:00 crc kubenswrapper[4681]: I0122 09:05:00.948734 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" event={"ID":"f75f1bac-c894-483b-a366-854399619cec","Type":"ContainerStarted","Data":"a78042e16d0fcfe135ef226ef73fcecbe6264124fad6799294d129e407b20fd7"} Jan 22 09:05:00 crc kubenswrapper[4681]: I0122 09:05:00.949532 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:05:00 crc kubenswrapper[4681]: I0122 09:05:00.992777 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" podStartSLOduration=7.992749116 podStartE2EDuration="7.992749116s" podCreationTimestamp="2026-01-22 09:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:05:00.988631842 +0000 UTC m=+91.814542357" watchObservedRunningTime="2026-01-22 09:05:00.992749116 +0000 UTC m=+91.818659661" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.038477 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.238799 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.241120 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.246997 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.268391 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/da4558af-4f26-46a4-839c-a5d77f360bfc-var-lock\") pod \"installer-9-crc\" (UID: \"da4558af-4f26-46a4-839c-a5d77f360bfc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.268474 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/da4558af-4f26-46a4-839c-a5d77f360bfc-kubelet-dir\") pod \"installer-9-crc\" (UID: \"da4558af-4f26-46a4-839c-a5d77f360bfc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.268496 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da4558af-4f26-46a4-839c-a5d77f360bfc-kube-api-access\") pod \"installer-9-crc\" (UID: \"da4558af-4f26-46a4-839c-a5d77f360bfc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.370723 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/da4558af-4f26-46a4-839c-a5d77f360bfc-var-lock\") pod \"installer-9-crc\" (UID: \"da4558af-4f26-46a4-839c-a5d77f360bfc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.370870 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/da4558af-4f26-46a4-839c-a5d77f360bfc-kubelet-dir\") pod \"installer-9-crc\" (UID: \"da4558af-4f26-46a4-839c-a5d77f360bfc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.370912 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da4558af-4f26-46a4-839c-a5d77f360bfc-kube-api-access\") pod \"installer-9-crc\" (UID: \"da4558af-4f26-46a4-839c-a5d77f360bfc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.370928 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/da4558af-4f26-46a4-839c-a5d77f360bfc-var-lock\") pod \"installer-9-crc\" (UID: \"da4558af-4f26-46a4-839c-a5d77f360bfc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.371084 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/da4558af-4f26-46a4-839c-a5d77f360bfc-kubelet-dir\") pod \"installer-9-crc\" (UID: \"da4558af-4f26-46a4-839c-a5d77f360bfc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.394258 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da4558af-4f26-46a4-839c-a5d77f360bfc-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"da4558af-4f26-46a4-839c-a5d77f360bfc\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.470060 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47db6098-5a83-4d02-bec9-886b3dd01a4f" path="/var/lib/kubelet/pods/47db6098-5a83-4d02-bec9-886b3dd01a4f/volumes" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.471894 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5334d917-0f71-4e93-ae7a-2a169f3b7a34" path="/var/lib/kubelet/pods/5334d917-0f71-4e93-ae7a-2a169f3b7a34/volumes" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.473507 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd2a235c-01fb-4c8c-97d6-cf399c39ab1c" path="/var/lib/kubelet/pods/fd2a235c-01fb-4c8c-97d6-cf399c39ab1c/volumes" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.558618 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.947316 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p"] Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.950859 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.955671 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.955993 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.956194 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.956245 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.957232 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.958084 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.960161 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p"] Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.968408 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.981476 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-client-ca\") pod \"controller-manager-8cfc7c9b7-jpb9p\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.981697 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-config\") pod \"controller-manager-8cfc7c9b7-jpb9p\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.981769 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-proxy-ca-bundles\") pod \"controller-manager-8cfc7c9b7-jpb9p\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.981822 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmrql\" (UniqueName: \"kubernetes.io/projected/efba4feb-6f7c-4d41-b09c-d622f0e240a9-kube-api-access-kmrql\") pod \"controller-manager-8cfc7c9b7-jpb9p\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:01 crc kubenswrapper[4681]: I0122 09:05:01.981855 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efba4feb-6f7c-4d41-b09c-d622f0e240a9-serving-cert\") pod \"controller-manager-8cfc7c9b7-jpb9p\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.023175 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.083634 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-client-ca\") pod \"controller-manager-8cfc7c9b7-jpb9p\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.084139 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-config\") pod \"controller-manager-8cfc7c9b7-jpb9p\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.084169 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-proxy-ca-bundles\") pod \"controller-manager-8cfc7c9b7-jpb9p\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.084197 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmrql\" (UniqueName: \"kubernetes.io/projected/efba4feb-6f7c-4d41-b09c-d622f0e240a9-kube-api-access-kmrql\") pod \"controller-manager-8cfc7c9b7-jpb9p\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.084224 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/efba4feb-6f7c-4d41-b09c-d622f0e240a9-serving-cert\") pod \"controller-manager-8cfc7c9b7-jpb9p\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.085695 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-client-ca\") pod \"controller-manager-8cfc7c9b7-jpb9p\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.086095 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-config\") pod \"controller-manager-8cfc7c9b7-jpb9p\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.086657 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-proxy-ca-bundles\") pod \"controller-manager-8cfc7c9b7-jpb9p\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.096198 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efba4feb-6f7c-4d41-b09c-d622f0e240a9-serving-cert\") pod \"controller-manager-8cfc7c9b7-jpb9p\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.111357 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmrql\" (UniqueName: \"kubernetes.io/projected/efba4feb-6f7c-4d41-b09c-d622f0e240a9-kube-api-access-kmrql\") pod \"controller-manager-8cfc7c9b7-jpb9p\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.291646 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.320847 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.346999 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.389920 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42895545-1050-480e-86cb-9591ab3d4e07-kube-api-access\") pod \"42895545-1050-480e-86cb-9591ab3d4e07\" (UID: \"42895545-1050-480e-86cb-9591ab3d4e07\") " Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.390027 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42895545-1050-480e-86cb-9591ab3d4e07-kubelet-dir\") pod \"42895545-1050-480e-86cb-9591ab3d4e07\" (UID: \"42895545-1050-480e-86cb-9591ab3d4e07\") " Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.390077 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/95b4c223-31d2-4bf0-b59e-581984a72a0b-kubelet-dir\") pod \"95b4c223-31d2-4bf0-b59e-581984a72a0b\" (UID: \"95b4c223-31d2-4bf0-b59e-581984a72a0b\") " Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.390106 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95b4c223-31d2-4bf0-b59e-581984a72a0b-kube-api-access\") pod \"95b4c223-31d2-4bf0-b59e-581984a72a0b\" (UID: \"95b4c223-31d2-4bf0-b59e-581984a72a0b\") " Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.391383 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42895545-1050-480e-86cb-9591ab3d4e07-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "42895545-1050-480e-86cb-9591ab3d4e07" (UID: "42895545-1050-480e-86cb-9591ab3d4e07"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.391472 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95b4c223-31d2-4bf0-b59e-581984a72a0b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "95b4c223-31d2-4bf0-b59e-581984a72a0b" (UID: "95b4c223-31d2-4bf0-b59e-581984a72a0b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.395242 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95b4c223-31d2-4bf0-b59e-581984a72a0b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "95b4c223-31d2-4bf0-b59e-581984a72a0b" (UID: "95b4c223-31d2-4bf0-b59e-581984a72a0b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.395297 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42895545-1050-480e-86cb-9591ab3d4e07-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "42895545-1050-480e-86cb-9591ab3d4e07" (UID: "42895545-1050-480e-86cb-9591ab3d4e07"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.492074 4681 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42895545-1050-480e-86cb-9591ab3d4e07-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.492107 4681 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/95b4c223-31d2-4bf0-b59e-581984a72a0b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.492118 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95b4c223-31d2-4bf0-b59e-581984a72a0b-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.492131 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42895545-1050-480e-86cb-9591ab3d4e07-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.529054 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p"] Jan 22 09:05:02 crc kubenswrapper[4681]: W0122 09:05:02.534062 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefba4feb_6f7c_4d41_b09c_d622f0e240a9.slice/crio-bc7b9aa4fa861c8d5621a6fa778b8100984632f6b2e112ffef3d140c6b61adeb WatchSource:0}: Error finding container bc7b9aa4fa861c8d5621a6fa778b8100984632f6b2e112ffef3d140c6b61adeb: Status 404 returned error can't find the container with id bc7b9aa4fa861c8d5621a6fa778b8100984632f6b2e112ffef3d140c6b61adeb Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.976898 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"95b4c223-31d2-4bf0-b59e-581984a72a0b","Type":"ContainerDied","Data":"8f2e0c7e509067b51f00d07a25816b32d2e223d2baa444f43b49284de4c6d57d"} Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.977328 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f2e0c7e509067b51f00d07a25816b32d2e223d2baa444f43b49284de4c6d57d" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.976960 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.980746 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"da4558af-4f26-46a4-839c-a5d77f360bfc","Type":"ContainerStarted","Data":"b93de64843ecbaf485ac7a1abe2b7b3e69d53ee49df43c9a1fec60fcfa11c725"} Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.981007 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"da4558af-4f26-46a4-839c-a5d77f360bfc","Type":"ContainerStarted","Data":"d06a10d49a5f9f8b34953c342d4498df64f6d7e95ab951816477eb9591c773c8"} Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.982956 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"42895545-1050-480e-86cb-9591ab3d4e07","Type":"ContainerDied","Data":"9a298fa0357a427815d869ef65f3b194bdb1b2f8c0e947ccaa6d234f59d583f8"} Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.982990 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a298fa0357a427815d869ef65f3b194bdb1b2f8c0e947ccaa6d234f59d583f8" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.983099 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.984681 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" event={"ID":"efba4feb-6f7c-4d41-b09c-d622f0e240a9","Type":"ContainerStarted","Data":"68f2ddf93b0c1aa71dcbeffa641e72b6dfd1f18549fbfc3d8d45292fcd08043c"} Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.984747 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" event={"ID":"efba4feb-6f7c-4d41-b09c-d622f0e240a9","Type":"ContainerStarted","Data":"bc7b9aa4fa861c8d5621a6fa778b8100984632f6b2e112ffef3d140c6b61adeb"} Jan 22 09:05:02 crc kubenswrapper[4681]: I0122 09:05:02.985036 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:03 crc kubenswrapper[4681]: I0122 09:05:03.005695 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:03 crc kubenswrapper[4681]: I0122 09:05:03.012173 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.012144923 podStartE2EDuration="2.012144923s" podCreationTimestamp="2026-01-22 09:05:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:05:03.011429985 +0000 UTC m=+93.837340500" watchObservedRunningTime="2026-01-22 09:05:03.012144923 +0000 UTC m=+93.838055428" Jan 22 09:05:03 crc kubenswrapper[4681]: I0122 09:05:03.039675 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" podStartSLOduration=10.03965139 podStartE2EDuration="10.03965139s" podCreationTimestamp="2026-01-22 09:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-22 09:05:03.03611091 +0000 UTC m=+93.862021425" watchObservedRunningTime="2026-01-22 09:05:03.03965139 +0000 UTC m=+93.865561895" Jan 22 09:05:05 crc kubenswrapper[4681]: I0122 09:05:05.003965 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8d59" event={"ID":"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9","Type":"ContainerStarted","Data":"b85bdabbac0af878eb15edbab5c9523761429af834748ce6a0409c831da5ea75"} Jan 22 09:05:06 crc kubenswrapper[4681]: I0122 09:05:06.013545 4681 generic.go:334] "Generic (PLEG): container finished" podID="0e85041c-4d16-4a52-ae78-e3dc2d1e81f9" containerID="b85bdabbac0af878eb15edbab5c9523761429af834748ce6a0409c831da5ea75" exitCode=0 Jan 22 09:05:06 crc kubenswrapper[4681]: I0122 09:05:06.013599 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8d59" event={"ID":"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9","Type":"ContainerDied","Data":"b85bdabbac0af878eb15edbab5c9523761429af834748ce6a0409c831da5ea75"} Jan 22 09:05:07 crc kubenswrapper[4681]: I0122 09:05:07.027128 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8d59" event={"ID":"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9","Type":"ContainerStarted","Data":"12e348d2516c3e7a309daf85738a13d0530084aa3080257d2ecf7afc44c8b118"} Jan 22 09:05:07 crc kubenswrapper[4681]: I0122 09:05:07.056613 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h8d59" podStartSLOduration=2.857927885 podStartE2EDuration="54.056587038s" podCreationTimestamp="2026-01-22 09:04:13 +0000 UTC" firstStartedPulling="2026-01-22 09:04:15.341482678 +0000 UTC m=+46.167393183" lastFinishedPulling="2026-01-22 09:05:06.540141801 +0000 UTC m=+97.366052336" observedRunningTime="2026-01-22 09:05:07.052861784 +0000 UTC m=+97.878772289" watchObservedRunningTime="2026-01-22 09:05:07.056587038 +0000 UTC m=+97.882497553" Jan 22 09:05:08 crc kubenswrapper[4681]: I0122 09:05:08.035613 4681 generic.go:334] "Generic (PLEG): container finished" podID="41006e44-10b2-443f-b477-8fd39e7b643e" containerID="8ae4c818b4deb95d19720bc72335e1270c96dbcfff6ac2a2e8c68b74b33c22ec" exitCode=0 Jan 22 09:05:08 crc kubenswrapper[4681]: I0122 09:05:08.035781 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8bfs" event={"ID":"41006e44-10b2-443f-b477-8fd39e7b643e","Type":"ContainerDied","Data":"8ae4c818b4deb95d19720bc72335e1270c96dbcfff6ac2a2e8c68b74b33c22ec"} Jan 22 09:05:09 crc kubenswrapper[4681]: I0122 09:05:09.044174 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8bfs" event={"ID":"41006e44-10b2-443f-b477-8fd39e7b643e","Type":"ContainerStarted","Data":"f19eb933a5e2460214bcd391c7cc3ae23d2940acae5f5100fe5fea6b9f3d1b39"} Jan 22 09:05:09 crc kubenswrapper[4681]: I0122 09:05:09.048124 4681 generic.go:334] "Generic (PLEG): container finished" podID="e6fbfd72-6801-4176-9847-653b6d0d9930" containerID="4d0d36534e5b37497c4d7818cde74400be04e5592041ac9bd92891a5cde0fcd5" exitCode=0 Jan 22 09:05:09 crc kubenswrapper[4681]: I0122 09:05:09.048156 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krnl8" event={"ID":"e6fbfd72-6801-4176-9847-653b6d0d9930","Type":"ContainerDied","Data":"4d0d36534e5b37497c4d7818cde74400be04e5592041ac9bd92891a5cde0fcd5"} Jan 22 09:05:09 crc kubenswrapper[4681]: I0122 09:05:09.070436 
4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q8bfs" podStartSLOduration=2.778713331 podStartE2EDuration="56.070401944s" podCreationTimestamp="2026-01-22 09:04:13 +0000 UTC" firstStartedPulling="2026-01-22 09:04:15.31582232 +0000 UTC m=+46.141732825" lastFinishedPulling="2026-01-22 09:05:08.607510893 +0000 UTC m=+99.433421438" observedRunningTime="2026-01-22 09:05:09.068318151 +0000 UTC m=+99.894228656" watchObservedRunningTime="2026-01-22 09:05:09.070401944 +0000 UTC m=+99.896312449" Jan 22 09:05:10 crc kubenswrapper[4681]: I0122 09:05:10.057432 4681 generic.go:334] "Generic (PLEG): container finished" podID="9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b" containerID="ecb7a618b93641938749213262354966dd5f0d743ae93d0dce6e50ed4cae4692" exitCode=0 Jan 22 09:05:10 crc kubenswrapper[4681]: I0122 09:05:10.057533 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdgpn" event={"ID":"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b","Type":"ContainerDied","Data":"ecb7a618b93641938749213262354966dd5f0d743ae93d0dce6e50ed4cae4692"} Jan 22 09:05:11 crc kubenswrapper[4681]: I0122 09:05:11.066302 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krnl8" event={"ID":"e6fbfd72-6801-4176-9847-653b6d0d9930","Type":"ContainerStarted","Data":"dd2b558abda30b4f3c000ef8bffe9f74c225b303f83624948a59d663930bf802"} Jan 22 09:05:11 crc kubenswrapper[4681]: I0122 09:05:11.069017 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdgpn" event={"ID":"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b","Type":"ContainerStarted","Data":"3c5465d5080b6a5c95965d29f8d5821cd64dcddf225d1fff97bf768940c1dc13"} Jan 22 09:05:11 crc kubenswrapper[4681]: I0122 09:05:11.088542 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-krnl8" podStartSLOduration=3.839698871 podStartE2EDuration="58.088520287s" podCreationTimestamp="2026-01-22 09:04:13 +0000 UTC" firstStartedPulling="2026-01-22 09:04:15.327908109 +0000 UTC m=+46.153818614" lastFinishedPulling="2026-01-22 09:05:09.576729515 +0000 UTC m=+100.402640030" observedRunningTime="2026-01-22 09:05:11.087132522 +0000 UTC m=+101.913043027" watchObservedRunningTime="2026-01-22 09:05:11.088520287 +0000 UTC m=+101.914430802" Jan 22 09:05:11 crc kubenswrapper[4681]: I0122 09:05:11.108041 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sdgpn" podStartSLOduration=3.5935308360000002 podStartE2EDuration="59.108014381s" podCreationTimestamp="2026-01-22 09:04:12 +0000 UTC" firstStartedPulling="2026-01-22 09:04:15.34270749 +0000 UTC m=+46.168617995" lastFinishedPulling="2026-01-22 09:05:10.857191035 +0000 UTC m=+101.683101540" observedRunningTime="2026-01-22 09:05:11.106591285 +0000 UTC m=+101.932501800" watchObservedRunningTime="2026-01-22 09:05:11.108014381 +0000 UTC m=+101.933924886" Jan 22 09:05:13 crc kubenswrapper[4681]: I0122 09:05:13.120184 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dskr" event={"ID":"78f21f15-5d84-4792-b7dd-2ae823beb0b0","Type":"ContainerStarted","Data":"a1f4dbe1997715334cb994382ab85cff0ae50d240e204636f4a94e6021907030"} Jan 22 09:05:13 crc kubenswrapper[4681]: I0122 09:05:13.334072 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-sdgpn" Jan 22 09:05:13 crc kubenswrapper[4681]: I0122 09:05:13.334153 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sdgpn" Jan 22 09:05:13 crc kubenswrapper[4681]: I0122 09:05:13.413416 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sdgpn" Jan 22 09:05:13 crc kubenswrapper[4681]: I0122 09:05:13.830938 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-krnl8" Jan 22 09:05:13 crc kubenswrapper[4681]: I0122 09:05:13.831420 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-krnl8" Jan 22 09:05:13 crc kubenswrapper[4681]: I0122 09:05:13.874808 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-krnl8" Jan 22 09:05:14 crc kubenswrapper[4681]: I0122 09:05:14.128016 4681 generic.go:334] "Generic (PLEG): container finished" podID="a2cce978-fbc9-46f8-bd29-015898f4977b" containerID="f0f4c642c61b9d8b5d0cd90badec96118e3a06c5231370c8b787f0fc1e694d47" exitCode=0 Jan 22 09:05:14 crc kubenswrapper[4681]: I0122 09:05:14.128097 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4tkb" event={"ID":"a2cce978-fbc9-46f8-bd29-015898f4977b","Type":"ContainerDied","Data":"f0f4c642c61b9d8b5d0cd90badec96118e3a06c5231370c8b787f0fc1e694d47"} Jan 22 09:05:14 crc kubenswrapper[4681]: I0122 09:05:14.135776 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q8bfs" Jan 22 09:05:14 crc kubenswrapper[4681]: I0122 09:05:14.135831 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q8bfs" Jan 22 09:05:14 crc kubenswrapper[4681]: I0122 09:05:14.138351 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h8d59" Jan 22 09:05:14 crc kubenswrapper[4681]: I0122 09:05:14.138408 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dskr" event={"ID":"78f21f15-5d84-4792-b7dd-2ae823beb0b0","Type":"ContainerDied","Data":"a1f4dbe1997715334cb994382ab85cff0ae50d240e204636f4a94e6021907030"} Jan 22 09:05:14 crc kubenswrapper[4681]: I0122 09:05:14.138442 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h8d59" Jan 22 09:05:14 crc kubenswrapper[4681]: I0122 09:05:14.138428 4681 generic.go:334] "Generic (PLEG): container finished" podID="78f21f15-5d84-4792-b7dd-2ae823beb0b0" containerID="a1f4dbe1997715334cb994382ab85cff0ae50d240e204636f4a94e6021907030" exitCode=0 Jan 22 09:05:14 crc kubenswrapper[4681]: I0122 09:05:14.189936 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q8bfs" Jan 22 09:05:14 crc kubenswrapper[4681]: I0122 09:05:14.196380 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h8d59" Jan 22 09:05:15 crc kubenswrapper[4681]: I0122 09:05:15.194355 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q8bfs" Jan 22 09:05:15 crc kubenswrapper[4681]: I0122 09:05:15.211591 4681 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h8d59" Jan 22 09:05:16 crc kubenswrapper[4681]: I0122 09:05:16.154337 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4tkb" event={"ID":"a2cce978-fbc9-46f8-bd29-015898f4977b","Type":"ContainerStarted","Data":"148902819eb977b50de1ba3c36ac5e217f759b5abc4a119bebbe10a3e01d4ef6"} Jan 22 09:05:16 crc kubenswrapper[4681]: I0122 09:05:16.156378 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pz5k" event={"ID":"cafe2ee6-7f62-4d78-8e7e-de58c8506696","Type":"ContainerStarted","Data":"b28ff44948658b8cc9c396d6bb4d7bf57abe59549fbf2acba14b0feea55950cd"} Jan 22 09:05:16 crc kubenswrapper[4681]: I0122 09:05:16.158446 4681 generic.go:334] "Generic (PLEG): container finished" podID="49ca3012-b9cb-46cd-b37c-4a74472c3fef" containerID="1598ce951490d8df05df022a77b7d8e4252d3ae0ac7f7ecd44f1d05fdcff4871" exitCode=0 Jan 22 09:05:16 crc kubenswrapper[4681]: I0122 09:05:16.158535 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44cjz" event={"ID":"49ca3012-b9cb-46cd-b37c-4a74472c3fef","Type":"ContainerDied","Data":"1598ce951490d8df05df022a77b7d8e4252d3ae0ac7f7ecd44f1d05fdcff4871"} Jan 22 09:05:16 crc kubenswrapper[4681]: I0122 09:05:16.162469 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dskr" event={"ID":"78f21f15-5d84-4792-b7dd-2ae823beb0b0","Type":"ContainerStarted","Data":"366a4e110862a966430b777c8ce0cf25984ff4176216fe69159c26d0a936a413"} Jan 22 09:05:16 crc kubenswrapper[4681]: I0122 09:05:16.179771 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r4tkb" podStartSLOduration=2.187216384 podStartE2EDuration="1m1.179557217s" podCreationTimestamp="2026-01-22 09:04:15 +0000 UTC" firstStartedPulling="2026-01-22 09:04:16.409485624 +0000 UTC m=+47.235396129" lastFinishedPulling="2026-01-22 09:05:15.401826457 +0000 UTC m=+106.227736962" observedRunningTime="2026-01-22 09:05:16.175583946 +0000 UTC m=+107.001494451" watchObservedRunningTime="2026-01-22 09:05:16.179557217 +0000 UTC m=+107.005467752" Jan 22 09:05:16 crc kubenswrapper[4681]: I0122 09:05:16.215186 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5dskr" podStartSLOduration=3.211925113 podStartE2EDuration="1m0.215163399s" podCreationTimestamp="2026-01-22 09:04:16 +0000 UTC" firstStartedPulling="2026-01-22 09:04:18.572520559 +0000 UTC m=+49.398431064" lastFinishedPulling="2026-01-22 09:05:15.575758845 +0000 UTC m=+106.401669350" observedRunningTime="2026-01-22 09:05:16.211996599 +0000 UTC m=+107.037907114" watchObservedRunningTime="2026-01-22 09:05:16.215163399 +0000 UTC m=+107.041073904" Jan 22 09:05:17 crc kubenswrapper[4681]: I0122 09:05:17.045942 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5dskr" Jan 22 09:05:17 crc kubenswrapper[4681]: I0122 09:05:17.046609 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5dskr" Jan 22 09:05:17 crc kubenswrapper[4681]: I0122 09:05:17.170375 4681 generic.go:334] "Generic (PLEG): container finished" podID="cafe2ee6-7f62-4d78-8e7e-de58c8506696" containerID="b28ff44948658b8cc9c396d6bb4d7bf57abe59549fbf2acba14b0feea55950cd" exitCode=0 
Jan 22 09:05:17 crc kubenswrapper[4681]: I0122 09:05:17.170770 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pz5k" event={"ID":"cafe2ee6-7f62-4d78-8e7e-de58c8506696","Type":"ContainerDied","Data":"b28ff44948658b8cc9c396d6bb4d7bf57abe59549fbf2acba14b0feea55950cd"} Jan 22 09:05:17 crc kubenswrapper[4681]: I0122 09:05:17.174210 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44cjz" event={"ID":"49ca3012-b9cb-46cd-b37c-4a74472c3fef","Type":"ContainerStarted","Data":"36340238d503d3c6649d5cb6b1c5b227c3f92280c661d33d4e928198107e86c0"} Jan 22 09:05:17 crc kubenswrapper[4681]: I0122 09:05:17.209125 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-44cjz" podStartSLOduration=2.73335194 podStartE2EDuration="1m3.209095048s" podCreationTimestamp="2026-01-22 09:04:14 +0000 UTC" firstStartedPulling="2026-01-22 09:04:16.426240127 +0000 UTC m=+47.252150632" lastFinishedPulling="2026-01-22 09:05:16.901983195 +0000 UTC m=+107.727893740" observedRunningTime="2026-01-22 09:05:17.207277572 +0000 UTC m=+108.033188077" watchObservedRunningTime="2026-01-22 09:05:17.209095048 +0000 UTC m=+108.035005563" Jan 22 09:05:17 crc kubenswrapper[4681]: I0122 09:05:17.854092 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q8bfs"] Jan 22 09:05:17 crc kubenswrapper[4681]: I0122 09:05:17.856096 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q8bfs" podUID="41006e44-10b2-443f-b477-8fd39e7b643e" containerName="registry-server" containerID="cri-o://f19eb933a5e2460214bcd391c7cc3ae23d2940acae5f5100fe5fea6b9f3d1b39" gracePeriod=2 Jan 22 09:05:18 crc kubenswrapper[4681]: I0122 09:05:18.089424 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5dskr" podUID="78f21f15-5d84-4792-b7dd-2ae823beb0b0" containerName="registry-server" probeResult="failure" output=< Jan 22 09:05:18 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Jan 22 09:05:18 crc kubenswrapper[4681]: > Jan 22 09:05:18 crc kubenswrapper[4681]: I0122 09:05:18.186814 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pz5k" event={"ID":"cafe2ee6-7f62-4d78-8e7e-de58c8506696","Type":"ContainerStarted","Data":"26347e5fe28e5a546dd4dcf5960db288addc2253d6e3f89b83e345f634e16c6a"} Jan 22 09:05:18 crc kubenswrapper[4681]: I0122 09:05:18.208051 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7pz5k" podStartSLOduration=2.10313573 podStartE2EDuration="1m2.208031443s" podCreationTimestamp="2026-01-22 09:04:16 +0000 UTC" firstStartedPulling="2026-01-22 09:04:17.510243775 +0000 UTC m=+48.336154280" lastFinishedPulling="2026-01-22 09:05:17.615139488 +0000 UTC m=+108.441049993" observedRunningTime="2026-01-22 09:05:18.204824612 +0000 UTC m=+109.030735127" watchObservedRunningTime="2026-01-22 09:05:18.208031443 +0000 UTC m=+109.033941948" Jan 22 09:05:19 crc kubenswrapper[4681]: I0122 09:05:19.195189 4681 generic.go:334] "Generic (PLEG): container finished" podID="41006e44-10b2-443f-b477-8fd39e7b643e" containerID="f19eb933a5e2460214bcd391c7cc3ae23d2940acae5f5100fe5fea6b9f3d1b39" exitCode=0 Jan 22 09:05:19 crc kubenswrapper[4681]: I0122 09:05:19.195274 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-q8bfs" event={"ID":"41006e44-10b2-443f-b477-8fd39e7b643e","Type":"ContainerDied","Data":"f19eb933a5e2460214bcd391c7cc3ae23d2940acae5f5100fe5fea6b9f3d1b39"} Jan 22 09:05:20 crc kubenswrapper[4681]: I0122 09:05:20.114580 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q8bfs" Jan 22 09:05:20 crc kubenswrapper[4681]: I0122 09:05:20.204471 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8bfs" event={"ID":"41006e44-10b2-443f-b477-8fd39e7b643e","Type":"ContainerDied","Data":"ca71f854fbd73db44c60d595334cd0f8322b6f5871bfab7e709c43e1a564caea"} Jan 22 09:05:20 crc kubenswrapper[4681]: I0122 09:05:20.204533 4681 scope.go:117] "RemoveContainer" containerID="f19eb933a5e2460214bcd391c7cc3ae23d2940acae5f5100fe5fea6b9f3d1b39" Jan 22 09:05:20 crc kubenswrapper[4681]: I0122 09:05:20.204578 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q8bfs" Jan 22 09:05:20 crc kubenswrapper[4681]: I0122 09:05:20.223093 4681 scope.go:117] "RemoveContainer" containerID="8ae4c818b4deb95d19720bc72335e1270c96dbcfff6ac2a2e8c68b74b33c22ec" Jan 22 09:05:20 crc kubenswrapper[4681]: I0122 09:05:20.235775 4681 scope.go:117] "RemoveContainer" containerID="ad0bd6b48fc4e99ab0567d9340e5374bdc68731c6e62e534a79cb7e4f3d5aedb" Jan 22 09:05:20 crc kubenswrapper[4681]: I0122 09:05:20.255090 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41006e44-10b2-443f-b477-8fd39e7b643e-catalog-content\") pod \"41006e44-10b2-443f-b477-8fd39e7b643e\" (UID: \"41006e44-10b2-443f-b477-8fd39e7b643e\") " Jan 22 09:05:20 crc kubenswrapper[4681]: I0122 09:05:20.255248 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41006e44-10b2-443f-b477-8fd39e7b643e-utilities\") pod \"41006e44-10b2-443f-b477-8fd39e7b643e\" (UID: \"41006e44-10b2-443f-b477-8fd39e7b643e\") " Jan 22 09:05:20 crc kubenswrapper[4681]: I0122 09:05:20.255306 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw59x\" (UniqueName: \"kubernetes.io/projected/41006e44-10b2-443f-b477-8fd39e7b643e-kube-api-access-tw59x\") pod \"41006e44-10b2-443f-b477-8fd39e7b643e\" (UID: \"41006e44-10b2-443f-b477-8fd39e7b643e\") " Jan 22 09:05:20 crc kubenswrapper[4681]: I0122 09:05:20.256400 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41006e44-10b2-443f-b477-8fd39e7b643e-utilities" (OuterVolumeSpecName: "utilities") pod "41006e44-10b2-443f-b477-8fd39e7b643e" (UID: "41006e44-10b2-443f-b477-8fd39e7b643e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:05:20 crc kubenswrapper[4681]: I0122 09:05:20.264918 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41006e44-10b2-443f-b477-8fd39e7b643e-kube-api-access-tw59x" (OuterVolumeSpecName: "kube-api-access-tw59x") pod "41006e44-10b2-443f-b477-8fd39e7b643e" (UID: "41006e44-10b2-443f-b477-8fd39e7b643e"). InnerVolumeSpecName "kube-api-access-tw59x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:05:20 crc kubenswrapper[4681]: I0122 09:05:20.298039 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41006e44-10b2-443f-b477-8fd39e7b643e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41006e44-10b2-443f-b477-8fd39e7b643e" (UID: "41006e44-10b2-443f-b477-8fd39e7b643e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:05:20 crc kubenswrapper[4681]: I0122 09:05:20.357004 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41006e44-10b2-443f-b477-8fd39e7b643e-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:20 crc kubenswrapper[4681]: I0122 09:05:20.357043 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw59x\" (UniqueName: \"kubernetes.io/projected/41006e44-10b2-443f-b477-8fd39e7b643e-kube-api-access-tw59x\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:20 crc kubenswrapper[4681]: I0122 09:05:20.357055 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41006e44-10b2-443f-b477-8fd39e7b643e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:20 crc kubenswrapper[4681]: I0122 09:05:20.548540 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q8bfs"] Jan 22 09:05:20 crc kubenswrapper[4681]: I0122 09:05:20.552651 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q8bfs"] Jan 22 09:05:21 crc kubenswrapper[4681]: I0122 09:05:21.461939 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41006e44-10b2-443f-b477-8fd39e7b643e" path="/var/lib/kubelet/pods/41006e44-10b2-443f-b477-8fd39e7b643e/volumes" Jan 22 09:05:23 crc kubenswrapper[4681]: I0122 09:05:23.386056 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sdgpn" Jan 22 09:05:23 crc kubenswrapper[4681]: I0122 09:05:23.875384 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-krnl8" Jan 22 09:05:24 crc kubenswrapper[4681]: I0122 09:05:24.861538 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-krnl8"] Jan 22 09:05:24 crc kubenswrapper[4681]: I0122 09:05:24.862203 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-krnl8" podUID="e6fbfd72-6801-4176-9847-653b6d0d9930" containerName="registry-server" containerID="cri-o://dd2b558abda30b4f3c000ef8bffe9f74c225b303f83624948a59d663930bf802" gracePeriod=2 Jan 22 09:05:25 crc kubenswrapper[4681]: I0122 09:05:25.207567 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-44cjz" Jan 22 09:05:25 crc kubenswrapper[4681]: I0122 09:05:25.207650 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-44cjz" Jan 22 09:05:25 crc kubenswrapper[4681]: I0122 09:05:25.268659 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-44cjz" Jan 22 09:05:25 crc kubenswrapper[4681]: I0122 09:05:25.342226 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-44cjz" Jan 22 09:05:25 crc kubenswrapper[4681]: I0122 09:05:25.602048 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r4tkb" Jan 22 09:05:25 crc kubenswrapper[4681]: I0122 09:05:25.602134 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r4tkb" Jan 22 09:05:25 crc kubenswrapper[4681]: I0122 09:05:25.673644 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r4tkb" Jan 22 09:05:26 crc kubenswrapper[4681]: I0122 09:05:26.297185 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r4tkb" Jan 22 09:05:26 crc kubenswrapper[4681]: I0122 09:05:26.624400 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7pz5k" Jan 22 09:05:26 crc kubenswrapper[4681]: I0122 09:05:26.624726 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7pz5k" Jan 22 09:05:26 crc kubenswrapper[4681]: I0122 09:05:26.662189 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7pz5k" Jan 22 09:05:27 crc kubenswrapper[4681]: I0122 09:05:27.109390 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5dskr" Jan 22 09:05:27 crc kubenswrapper[4681]: I0122 09:05:27.182546 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5dskr" Jan 22 09:05:27 crc kubenswrapper[4681]: I0122 09:05:27.313982 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7pz5k" Jan 22 09:05:28 crc kubenswrapper[4681]: I0122 09:05:28.259021 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4tkb"] Jan 22 09:05:28 crc kubenswrapper[4681]: I0122 09:05:28.267102 4681 generic.go:334] "Generic (PLEG): container finished" podID="e6fbfd72-6801-4176-9847-653b6d0d9930" containerID="dd2b558abda30b4f3c000ef8bffe9f74c225b303f83624948a59d663930bf802" exitCode=0 Jan 22 09:05:28 crc kubenswrapper[4681]: I0122 09:05:28.267444 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krnl8" event={"ID":"e6fbfd72-6801-4176-9847-653b6d0d9930","Type":"ContainerDied","Data":"dd2b558abda30b4f3c000ef8bffe9f74c225b303f83624948a59d663930bf802"} Jan 22 09:05:28 crc kubenswrapper[4681]: I0122 09:05:28.267743 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r4tkb" podUID="a2cce978-fbc9-46f8-bd29-015898f4977b" containerName="registry-server" containerID="cri-o://148902819eb977b50de1ba3c36ac5e217f759b5abc4a119bebbe10a3e01d4ef6" gracePeriod=2 Jan 22 09:05:29 crc kubenswrapper[4681]: I0122 09:05:29.278604 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krnl8" event={"ID":"e6fbfd72-6801-4176-9847-653b6d0d9930","Type":"ContainerDied","Data":"bee6efed67d1b30698b166ce690bbad2b502c7d021dfd68f3dd4534248fed688"} Jan 22 09:05:29 crc kubenswrapper[4681]: I0122 09:05:29.278915 4681 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="bee6efed67d1b30698b166ce690bbad2b502c7d021dfd68f3dd4534248fed688" Jan 22 09:05:29 crc kubenswrapper[4681]: I0122 09:05:29.281346 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-krnl8" Jan 22 09:05:29 crc kubenswrapper[4681]: I0122 09:05:29.385776 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6fbfd72-6801-4176-9847-653b6d0d9930-utilities\") pod \"e6fbfd72-6801-4176-9847-653b6d0d9930\" (UID: \"e6fbfd72-6801-4176-9847-653b6d0d9930\") " Jan 22 09:05:29 crc kubenswrapper[4681]: I0122 09:05:29.385839 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6fbfd72-6801-4176-9847-653b6d0d9930-catalog-content\") pod \"e6fbfd72-6801-4176-9847-653b6d0d9930\" (UID: \"e6fbfd72-6801-4176-9847-653b6d0d9930\") " Jan 22 09:05:29 crc kubenswrapper[4681]: I0122 09:05:29.385917 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2gw4\" (UniqueName: \"kubernetes.io/projected/e6fbfd72-6801-4176-9847-653b6d0d9930-kube-api-access-k2gw4\") pod \"e6fbfd72-6801-4176-9847-653b6d0d9930\" (UID: \"e6fbfd72-6801-4176-9847-653b6d0d9930\") " Jan 22 09:05:29 crc kubenswrapper[4681]: I0122 09:05:29.387521 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6fbfd72-6801-4176-9847-653b6d0d9930-utilities" (OuterVolumeSpecName: "utilities") pod "e6fbfd72-6801-4176-9847-653b6d0d9930" (UID: "e6fbfd72-6801-4176-9847-653b6d0d9930"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:05:29 crc kubenswrapper[4681]: I0122 09:05:29.394749 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6fbfd72-6801-4176-9847-653b6d0d9930-kube-api-access-k2gw4" (OuterVolumeSpecName: "kube-api-access-k2gw4") pod "e6fbfd72-6801-4176-9847-653b6d0d9930" (UID: "e6fbfd72-6801-4176-9847-653b6d0d9930"). InnerVolumeSpecName "kube-api-access-k2gw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:05:29 crc kubenswrapper[4681]: I0122 09:05:29.463313 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6fbfd72-6801-4176-9847-653b6d0d9930-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6fbfd72-6801-4176-9847-653b6d0d9930" (UID: "e6fbfd72-6801-4176-9847-653b6d0d9930"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:05:29 crc kubenswrapper[4681]: I0122 09:05:29.489053 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6fbfd72-6801-4176-9847-653b6d0d9930-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:29 crc kubenswrapper[4681]: I0122 09:05:29.489095 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6fbfd72-6801-4176-9847-653b6d0d9930-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:29 crc kubenswrapper[4681]: I0122 09:05:29.489482 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2gw4\" (UniqueName: \"kubernetes.io/projected/e6fbfd72-6801-4176-9847-653b6d0d9930-kube-api-access-k2gw4\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:30 crc kubenswrapper[4681]: I0122 09:05:30.292846 4681 generic.go:334] "Generic (PLEG): container finished" podID="a2cce978-fbc9-46f8-bd29-015898f4977b" containerID="148902819eb977b50de1ba3c36ac5e217f759b5abc4a119bebbe10a3e01d4ef6" exitCode=0 Jan 22 09:05:30 crc kubenswrapper[4681]: I0122 09:05:30.292922 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4tkb" event={"ID":"a2cce978-fbc9-46f8-bd29-015898f4977b","Type":"ContainerDied","Data":"148902819eb977b50de1ba3c36ac5e217f759b5abc4a119bebbe10a3e01d4ef6"} Jan 22 09:05:30 crc kubenswrapper[4681]: I0122 09:05:30.293012 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-krnl8" Jan 22 09:05:30 crc kubenswrapper[4681]: I0122 09:05:30.322175 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-krnl8"] Jan 22 09:05:30 crc kubenswrapper[4681]: I0122 09:05:30.326744 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-krnl8"] Jan 22 09:05:30 crc kubenswrapper[4681]: I0122 09:05:30.654505 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5dskr"] Jan 22 09:05:30 crc kubenswrapper[4681]: I0122 09:05:30.655059 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5dskr" podUID="78f21f15-5d84-4792-b7dd-2ae823beb0b0" containerName="registry-server" containerID="cri-o://366a4e110862a966430b777c8ce0cf25984ff4176216fe69159c26d0a936a413" gracePeriod=2 Jan 22 09:05:30 crc kubenswrapper[4681]: I0122 09:05:30.736282 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r4tkb" Jan 22 09:05:30 crc kubenswrapper[4681]: I0122 09:05:30.738377 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2cce978-fbc9-46f8-bd29-015898f4977b-catalog-content\") pod \"a2cce978-fbc9-46f8-bd29-015898f4977b\" (UID: \"a2cce978-fbc9-46f8-bd29-015898f4977b\") " Jan 22 09:05:30 crc kubenswrapper[4681]: I0122 09:05:30.738440 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2cce978-fbc9-46f8-bd29-015898f4977b-utilities\") pod \"a2cce978-fbc9-46f8-bd29-015898f4977b\" (UID: \"a2cce978-fbc9-46f8-bd29-015898f4977b\") " Jan 22 09:05:30 crc kubenswrapper[4681]: I0122 09:05:30.738472 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f5j5\" (UniqueName: \"kubernetes.io/projected/a2cce978-fbc9-46f8-bd29-015898f4977b-kube-api-access-7f5j5\") pod \"a2cce978-fbc9-46f8-bd29-015898f4977b\" (UID: \"a2cce978-fbc9-46f8-bd29-015898f4977b\") " Jan 22 09:05:30 crc kubenswrapper[4681]: I0122 09:05:30.740351 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2cce978-fbc9-46f8-bd29-015898f4977b-utilities" (OuterVolumeSpecName: "utilities") pod "a2cce978-fbc9-46f8-bd29-015898f4977b" (UID: "a2cce978-fbc9-46f8-bd29-015898f4977b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:05:30 crc kubenswrapper[4681]: I0122 09:05:30.746412 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2cce978-fbc9-46f8-bd29-015898f4977b-kube-api-access-7f5j5" (OuterVolumeSpecName: "kube-api-access-7f5j5") pod "a2cce978-fbc9-46f8-bd29-015898f4977b" (UID: "a2cce978-fbc9-46f8-bd29-015898f4977b"). InnerVolumeSpecName "kube-api-access-7f5j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:05:30 crc kubenswrapper[4681]: I0122 09:05:30.773481 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2cce978-fbc9-46f8-bd29-015898f4977b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2cce978-fbc9-46f8-bd29-015898f4977b" (UID: "a2cce978-fbc9-46f8-bd29-015898f4977b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:05:30 crc kubenswrapper[4681]: I0122 09:05:30.839051 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2cce978-fbc9-46f8-bd29-015898f4977b-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:30 crc kubenswrapper[4681]: I0122 09:05:30.839079 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f5j5\" (UniqueName: \"kubernetes.io/projected/a2cce978-fbc9-46f8-bd29-015898f4977b-kube-api-access-7f5j5\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:30 crc kubenswrapper[4681]: I0122 09:05:30.839090 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2cce978-fbc9-46f8-bd29-015898f4977b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:31 crc kubenswrapper[4681]: I0122 09:05:31.303012 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4tkb" event={"ID":"a2cce978-fbc9-46f8-bd29-015898f4977b","Type":"ContainerDied","Data":"67c4ded127efe95c536c2570806e352cd01a3f2f98e48e6ad59eeeeb3faf8a3d"} Jan 22 09:05:31 crc kubenswrapper[4681]: I0122 09:05:31.303063 4681 scope.go:117] "RemoveContainer" containerID="148902819eb977b50de1ba3c36ac5e217f759b5abc4a119bebbe10a3e01d4ef6" Jan 22 09:05:31 crc kubenswrapper[4681]: I0122 09:05:31.303072 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r4tkb" Jan 22 09:05:31 crc kubenswrapper[4681]: I0122 09:05:31.333186 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4tkb"] Jan 22 09:05:31 crc kubenswrapper[4681]: I0122 09:05:31.335197 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4tkb"] Jan 22 09:05:31 crc kubenswrapper[4681]: I0122 09:05:31.461201 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2cce978-fbc9-46f8-bd29-015898f4977b" path="/var/lib/kubelet/pods/a2cce978-fbc9-46f8-bd29-015898f4977b/volumes" Jan 22 09:05:31 crc kubenswrapper[4681]: I0122 09:05:31.462502 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6fbfd72-6801-4176-9847-653b6d0d9930" path="/var/lib/kubelet/pods/e6fbfd72-6801-4176-9847-653b6d0d9930/volumes" Jan 22 09:05:31 crc kubenswrapper[4681]: I0122 09:05:31.470658 4681 scope.go:117] "RemoveContainer" containerID="f0f4c642c61b9d8b5d0cd90badec96118e3a06c5231370c8b787f0fc1e694d47" Jan 22 09:05:31 crc kubenswrapper[4681]: I0122 09:05:31.553136 4681 scope.go:117] "RemoveContainer" containerID="48205f1210faf9932d845e7de7db8345f271e6c644654cfd465e37e13ae11829" Jan 22 09:05:32 crc kubenswrapper[4681]: I0122 09:05:32.313979 4681 generic.go:334] "Generic (PLEG): container finished" podID="78f21f15-5d84-4792-b7dd-2ae823beb0b0" containerID="366a4e110862a966430b777c8ce0cf25984ff4176216fe69159c26d0a936a413" exitCode=0 Jan 22 09:05:32 crc kubenswrapper[4681]: I0122 09:05:32.314062 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dskr" event={"ID":"78f21f15-5d84-4792-b7dd-2ae823beb0b0","Type":"ContainerDied","Data":"366a4e110862a966430b777c8ce0cf25984ff4176216fe69159c26d0a936a413"} Jan 22 09:05:32 crc kubenswrapper[4681]: I0122 09:05:32.314412 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dskr" 
event={"ID":"78f21f15-5d84-4792-b7dd-2ae823beb0b0","Type":"ContainerDied","Data":"0c678d250d47d3f62e9355e80cfea2ccebb1af2653f20870c82b5c6f84c2881e"} Jan 22 09:05:32 crc kubenswrapper[4681]: I0122 09:05:32.314438 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c678d250d47d3f62e9355e80cfea2ccebb1af2653f20870c82b5c6f84c2881e" Jan 22 09:05:32 crc kubenswrapper[4681]: I0122 09:05:32.357606 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5dskr" Jan 22 09:05:32 crc kubenswrapper[4681]: I0122 09:05:32.460504 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78f21f15-5d84-4792-b7dd-2ae823beb0b0-utilities\") pod \"78f21f15-5d84-4792-b7dd-2ae823beb0b0\" (UID: \"78f21f15-5d84-4792-b7dd-2ae823beb0b0\") " Jan 22 09:05:32 crc kubenswrapper[4681]: I0122 09:05:32.460699 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78f21f15-5d84-4792-b7dd-2ae823beb0b0-catalog-content\") pod \"78f21f15-5d84-4792-b7dd-2ae823beb0b0\" (UID: \"78f21f15-5d84-4792-b7dd-2ae823beb0b0\") " Jan 22 09:05:32 crc kubenswrapper[4681]: I0122 09:05:32.460798 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kz4f\" (UniqueName: \"kubernetes.io/projected/78f21f15-5d84-4792-b7dd-2ae823beb0b0-kube-api-access-8kz4f\") pod \"78f21f15-5d84-4792-b7dd-2ae823beb0b0\" (UID: \"78f21f15-5d84-4792-b7dd-2ae823beb0b0\") " Jan 22 09:05:32 crc kubenswrapper[4681]: I0122 09:05:32.461788 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78f21f15-5d84-4792-b7dd-2ae823beb0b0-utilities" (OuterVolumeSpecName: "utilities") pod "78f21f15-5d84-4792-b7dd-2ae823beb0b0" (UID: "78f21f15-5d84-4792-b7dd-2ae823beb0b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:05:32 crc kubenswrapper[4681]: I0122 09:05:32.470027 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f21f15-5d84-4792-b7dd-2ae823beb0b0-kube-api-access-8kz4f" (OuterVolumeSpecName: "kube-api-access-8kz4f") pod "78f21f15-5d84-4792-b7dd-2ae823beb0b0" (UID: "78f21f15-5d84-4792-b7dd-2ae823beb0b0"). InnerVolumeSpecName "kube-api-access-8kz4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:05:32 crc kubenswrapper[4681]: I0122 09:05:32.562207 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kz4f\" (UniqueName: \"kubernetes.io/projected/78f21f15-5d84-4792-b7dd-2ae823beb0b0-kube-api-access-8kz4f\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:32 crc kubenswrapper[4681]: I0122 09:05:32.562251 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78f21f15-5d84-4792-b7dd-2ae823beb0b0-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:32 crc kubenswrapper[4681]: I0122 09:05:32.628108 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78f21f15-5d84-4792-b7dd-2ae823beb0b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78f21f15-5d84-4792-b7dd-2ae823beb0b0" (UID: "78f21f15-5d84-4792-b7dd-2ae823beb0b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:05:32 crc kubenswrapper[4681]: I0122 09:05:32.667007 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78f21f15-5d84-4792-b7dd-2ae823beb0b0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.031696 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p"] Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.031942 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" podUID="efba4feb-6f7c-4d41-b09c-d622f0e240a9" containerName="controller-manager" containerID="cri-o://68f2ddf93b0c1aa71dcbeffa641e72b6dfd1f18549fbfc3d8d45292fcd08043c" gracePeriod=30 Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.128304 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg"] Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.128556 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" podUID="f75f1bac-c894-483b-a366-854399619cec" containerName="route-controller-manager" containerID="cri-o://9952b5cd362e51bccf562c2d69349fb7b86a94907475bd04fc0ef7b26507e058" gracePeriod=30 Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.323788 4681 generic.go:334] "Generic (PLEG): container finished" podID="efba4feb-6f7c-4d41-b09c-d622f0e240a9" containerID="68f2ddf93b0c1aa71dcbeffa641e72b6dfd1f18549fbfc3d8d45292fcd08043c" exitCode=0 Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.323862 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" event={"ID":"efba4feb-6f7c-4d41-b09c-d622f0e240a9","Type":"ContainerDied","Data":"68f2ddf93b0c1aa71dcbeffa641e72b6dfd1f18549fbfc3d8d45292fcd08043c"} Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.325370 4681 generic.go:334] "Generic (PLEG): container finished" podID="f75f1bac-c894-483b-a366-854399619cec" containerID="9952b5cd362e51bccf562c2d69349fb7b86a94907475bd04fc0ef7b26507e058" exitCode=0 Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.325452 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5dskr" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.326340 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" event={"ID":"f75f1bac-c894-483b-a366-854399619cec","Type":"ContainerDied","Data":"9952b5cd362e51bccf562c2d69349fb7b86a94907475bd04fc0ef7b26507e058"} Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.361459 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5dskr"] Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.364686 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5dskr"] Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.461297 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78f21f15-5d84-4792-b7dd-2ae823beb0b0" path="/var/lib/kubelet/pods/78f21f15-5d84-4792-b7dd-2ae823beb0b0/volumes" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.585897 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.591173 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.682172 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxzn4\" (UniqueName: \"kubernetes.io/projected/f75f1bac-c894-483b-a366-854399619cec-kube-api-access-qxzn4\") pod \"f75f1bac-c894-483b-a366-854399619cec\" (UID: \"f75f1bac-c894-483b-a366-854399619cec\") " Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.682230 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmrql\" (UniqueName: \"kubernetes.io/projected/efba4feb-6f7c-4d41-b09c-d622f0e240a9-kube-api-access-kmrql\") pod \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.682278 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-proxy-ca-bundles\") pod \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.682321 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-config\") pod \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.682368 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f75f1bac-c894-483b-a366-854399619cec-client-ca\") pod \"f75f1bac-c894-483b-a366-854399619cec\" (UID: \"f75f1bac-c894-483b-a366-854399619cec\") " Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.682396 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efba4feb-6f7c-4d41-b09c-d622f0e240a9-serving-cert\") pod \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\" (UID: 
\"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.682421 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f75f1bac-c894-483b-a366-854399619cec-serving-cert\") pod \"f75f1bac-c894-483b-a366-854399619cec\" (UID: \"f75f1bac-c894-483b-a366-854399619cec\") " Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.682442 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-client-ca\") pod \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\" (UID: \"efba4feb-6f7c-4d41-b09c-d622f0e240a9\") " Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.682466 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f75f1bac-c894-483b-a366-854399619cec-config\") pod \"f75f1bac-c894-483b-a366-854399619cec\" (UID: \"f75f1bac-c894-483b-a366-854399619cec\") " Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.683011 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "efba4feb-6f7c-4d41-b09c-d622f0e240a9" (UID: "efba4feb-6f7c-4d41-b09c-d622f0e240a9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.683010 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f75f1bac-c894-483b-a366-854399619cec-client-ca" (OuterVolumeSpecName: "client-ca") pod "f75f1bac-c894-483b-a366-854399619cec" (UID: "f75f1bac-c894-483b-a366-854399619cec"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.683114 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f75f1bac-c894-483b-a366-854399619cec-config" (OuterVolumeSpecName: "config") pod "f75f1bac-c894-483b-a366-854399619cec" (UID: "f75f1bac-c894-483b-a366-854399619cec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.683215 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-config" (OuterVolumeSpecName: "config") pod "efba4feb-6f7c-4d41-b09c-d622f0e240a9" (UID: "efba4feb-6f7c-4d41-b09c-d622f0e240a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.683665 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-client-ca" (OuterVolumeSpecName: "client-ca") pod "efba4feb-6f7c-4d41-b09c-d622f0e240a9" (UID: "efba4feb-6f7c-4d41-b09c-d622f0e240a9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.687377 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f75f1bac-c894-483b-a366-854399619cec-kube-api-access-qxzn4" (OuterVolumeSpecName: "kube-api-access-qxzn4") pod "f75f1bac-c894-483b-a366-854399619cec" (UID: "f75f1bac-c894-483b-a366-854399619cec"). InnerVolumeSpecName "kube-api-access-qxzn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.687618 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efba4feb-6f7c-4d41-b09c-d622f0e240a9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "efba4feb-6f7c-4d41-b09c-d622f0e240a9" (UID: "efba4feb-6f7c-4d41-b09c-d622f0e240a9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.687644 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f75f1bac-c894-483b-a366-854399619cec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f75f1bac-c894-483b-a366-854399619cec" (UID: "f75f1bac-c894-483b-a366-854399619cec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.689557 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efba4feb-6f7c-4d41-b09c-d622f0e240a9-kube-api-access-kmrql" (OuterVolumeSpecName: "kube-api-access-kmrql") pod "efba4feb-6f7c-4d41-b09c-d622f0e240a9" (UID: "efba4feb-6f7c-4d41-b09c-d622f0e240a9"). InnerVolumeSpecName "kube-api-access-kmrql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.787237 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxzn4\" (UniqueName: \"kubernetes.io/projected/f75f1bac-c894-483b-a366-854399619cec-kube-api-access-qxzn4\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.787339 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmrql\" (UniqueName: \"kubernetes.io/projected/efba4feb-6f7c-4d41-b09c-d622f0e240a9-kube-api-access-kmrql\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.787368 4681 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.787394 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.787421 4681 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f75f1bac-c894-483b-a366-854399619cec-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.787445 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efba4feb-6f7c-4d41-b09c-d622f0e240a9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.787471 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f75f1bac-c894-483b-a366-854399619cec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.787494 4681 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efba4feb-6f7c-4d41-b09c-d622f0e240a9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:33 crc kubenswrapper[4681]: I0122 09:05:33.787517 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f75f1bac-c894-483b-a366-854399619cec-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.330246 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" event={"ID":"f75f1bac-c894-483b-a366-854399619cec","Type":"ContainerDied","Data":"a78042e16d0fcfe135ef226ef73fcecbe6264124fad6799294d129e407b20fd7"} Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.330331 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.330715 4681 scope.go:117] "RemoveContainer" containerID="9952b5cd362e51bccf562c2d69349fb7b86a94907475bd04fc0ef7b26507e058" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.332111 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" event={"ID":"efba4feb-6f7c-4d41-b09c-d622f0e240a9","Type":"ContainerDied","Data":"bc7b9aa4fa861c8d5621a6fa778b8100984632f6b2e112ffef3d140c6b61adeb"} Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.332167 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.364325 4681 scope.go:117] "RemoveContainer" containerID="68f2ddf93b0c1aa71dcbeffa641e72b6dfd1f18549fbfc3d8d45292fcd08043c" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.378550 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p"] Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.402456 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8cfc7c9b7-jpb9p"] Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.428671 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg"] Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.431041 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84ddb9ccd5-sllhg"] Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.451129 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dmfxj"] Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.970947 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm"] Jan 22 09:05:34 crc kubenswrapper[4681]: E0122 09:05:34.971256 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6fbfd72-6801-4176-9847-653b6d0d9930" containerName="extract-utilities" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.971309 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6fbfd72-6801-4176-9847-653b6d0d9930" containerName="extract-utilities" Jan 22 09:05:34 crc kubenswrapper[4681]: E0122 09:05:34.971328 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42895545-1050-480e-86cb-9591ab3d4e07" containerName="pruner" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.971341 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="42895545-1050-480e-86cb-9591ab3d4e07" containerName="pruner" Jan 22 09:05:34 crc kubenswrapper[4681]: E0122 09:05:34.971360 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f21f15-5d84-4792-b7dd-2ae823beb0b0" containerName="registry-server" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.971373 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f21f15-5d84-4792-b7dd-2ae823beb0b0" containerName="registry-server" Jan 22 09:05:34 crc kubenswrapper[4681]: E0122 09:05:34.971391 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41006e44-10b2-443f-b477-8fd39e7b643e" containerName="extract-utilities" 
Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.971402 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="41006e44-10b2-443f-b477-8fd39e7b643e" containerName="extract-utilities" Jan 22 09:05:34 crc kubenswrapper[4681]: E0122 09:05:34.971415 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2cce978-fbc9-46f8-bd29-015898f4977b" containerName="extract-content" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.971428 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2cce978-fbc9-46f8-bd29-015898f4977b" containerName="extract-content" Jan 22 09:05:34 crc kubenswrapper[4681]: E0122 09:05:34.971442 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6fbfd72-6801-4176-9847-653b6d0d9930" containerName="registry-server" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.971454 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6fbfd72-6801-4176-9847-653b6d0d9930" containerName="registry-server" Jan 22 09:05:34 crc kubenswrapper[4681]: E0122 09:05:34.971472 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f75f1bac-c894-483b-a366-854399619cec" containerName="route-controller-manager" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.971486 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f75f1bac-c894-483b-a366-854399619cec" containerName="route-controller-manager" Jan 22 09:05:34 crc kubenswrapper[4681]: E0122 09:05:34.971509 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2cce978-fbc9-46f8-bd29-015898f4977b" containerName="extract-utilities" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.971521 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2cce978-fbc9-46f8-bd29-015898f4977b" containerName="extract-utilities" Jan 22 09:05:34 crc kubenswrapper[4681]: E0122 09:05:34.971538 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f21f15-5d84-4792-b7dd-2ae823beb0b0" containerName="extract-utilities" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.971550 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f21f15-5d84-4792-b7dd-2ae823beb0b0" containerName="extract-utilities" Jan 22 09:05:34 crc kubenswrapper[4681]: E0122 09:05:34.971566 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f21f15-5d84-4792-b7dd-2ae823beb0b0" containerName="extract-content" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.971580 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f21f15-5d84-4792-b7dd-2ae823beb0b0" containerName="extract-content" Jan 22 09:05:34 crc kubenswrapper[4681]: E0122 09:05:34.971605 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efba4feb-6f7c-4d41-b09c-d622f0e240a9" containerName="controller-manager" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.971618 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="efba4feb-6f7c-4d41-b09c-d622f0e240a9" containerName="controller-manager" Jan 22 09:05:34 crc kubenswrapper[4681]: E0122 09:05:34.971635 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41006e44-10b2-443f-b477-8fd39e7b643e" containerName="registry-server" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.971646 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="41006e44-10b2-443f-b477-8fd39e7b643e" containerName="registry-server" Jan 22 09:05:34 crc kubenswrapper[4681]: E0122 09:05:34.971664 4681 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="41006e44-10b2-443f-b477-8fd39e7b643e" containerName="extract-content" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.971675 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="41006e44-10b2-443f-b477-8fd39e7b643e" containerName="extract-content" Jan 22 09:05:34 crc kubenswrapper[4681]: E0122 09:05:34.971691 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2cce978-fbc9-46f8-bd29-015898f4977b" containerName="registry-server" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.971703 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2cce978-fbc9-46f8-bd29-015898f4977b" containerName="registry-server" Jan 22 09:05:34 crc kubenswrapper[4681]: E0122 09:05:34.971720 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6fbfd72-6801-4176-9847-653b6d0d9930" containerName="extract-content" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.971732 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6fbfd72-6801-4176-9847-653b6d0d9930" containerName="extract-content" Jan 22 09:05:34 crc kubenswrapper[4681]: E0122 09:05:34.971751 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b4c223-31d2-4bf0-b59e-581984a72a0b" containerName="pruner" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.971762 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b4c223-31d2-4bf0-b59e-581984a72a0b" containerName="pruner" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.972020 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="95b4c223-31d2-4bf0-b59e-581984a72a0b" containerName="pruner" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.972046 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f21f15-5d84-4792-b7dd-2ae823beb0b0" containerName="registry-server" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.972066 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6fbfd72-6801-4176-9847-653b6d0d9930" containerName="registry-server" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.972081 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2cce978-fbc9-46f8-bd29-015898f4977b" containerName="registry-server" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.972097 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="42895545-1050-480e-86cb-9591ab3d4e07" containerName="pruner" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.972118 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="41006e44-10b2-443f-b477-8fd39e7b643e" containerName="registry-server" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.972131 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="f75f1bac-c894-483b-a366-854399619cec" containerName="route-controller-manager" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.972152 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="efba4feb-6f7c-4d41-b09c-d622f0e240a9" containerName="controller-manager" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.972755 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:34 crc kubenswrapper[4681]: I0122 09:05:34.975128 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg"] Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:34.976217 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:34.976954 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:34.977017 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:34.979667 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:34.980130 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:34.980305 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:34.980868 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:34.981007 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:34.982181 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:34.982350 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:34.982610 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:34.982805 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm"] Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:34.983394 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:34.983523 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:34.986506 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg"] Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:34.993793 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.004696 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72bgr\" (UniqueName: 
\"kubernetes.io/projected/88771d4a-dace-45a4-bd87-9f25aaa52f50-kube-api-access-72bgr\") pod \"controller-manager-7664b4d6d8-f5xfm\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.004752 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-proxy-ca-bundles\") pod \"controller-manager-7664b4d6d8-f5xfm\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.004786 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-client-ca\") pod \"controller-manager-7664b4d6d8-f5xfm\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.004919 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88771d4a-dace-45a4-bd87-9f25aaa52f50-serving-cert\") pod \"controller-manager-7664b4d6d8-f5xfm\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.005015 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/045520f1-ee73-440d-84a0-ebc935904fa2-client-ca\") pod \"route-controller-manager-6b8b4b6776-gjrdg\" (UID: \"045520f1-ee73-440d-84a0-ebc935904fa2\") " pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.005058 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42sfl\" (UniqueName: \"kubernetes.io/projected/045520f1-ee73-440d-84a0-ebc935904fa2-kube-api-access-42sfl\") pod \"route-controller-manager-6b8b4b6776-gjrdg\" (UID: \"045520f1-ee73-440d-84a0-ebc935904fa2\") " pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.005112 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-config\") pod \"controller-manager-7664b4d6d8-f5xfm\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.005304 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/045520f1-ee73-440d-84a0-ebc935904fa2-serving-cert\") pod \"route-controller-manager-6b8b4b6776-gjrdg\" (UID: \"045520f1-ee73-440d-84a0-ebc935904fa2\") " pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.005353 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/045520f1-ee73-440d-84a0-ebc935904fa2-config\") pod \"route-controller-manager-6b8b4b6776-gjrdg\" (UID: \"045520f1-ee73-440d-84a0-ebc935904fa2\") " pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.106721 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-client-ca\") pod \"controller-manager-7664b4d6d8-f5xfm\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.106792 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88771d4a-dace-45a4-bd87-9f25aaa52f50-serving-cert\") pod \"controller-manager-7664b4d6d8-f5xfm\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.106841 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/045520f1-ee73-440d-84a0-ebc935904fa2-client-ca\") pod \"route-controller-manager-6b8b4b6776-gjrdg\" (UID: \"045520f1-ee73-440d-84a0-ebc935904fa2\") " pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.106880 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42sfl\" (UniqueName: \"kubernetes.io/projected/045520f1-ee73-440d-84a0-ebc935904fa2-kube-api-access-42sfl\") pod \"route-controller-manager-6b8b4b6776-gjrdg\" (UID: \"045520f1-ee73-440d-84a0-ebc935904fa2\") " pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.106911 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-config\") pod \"controller-manager-7664b4d6d8-f5xfm\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.106974 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/045520f1-ee73-440d-84a0-ebc935904fa2-serving-cert\") pod \"route-controller-manager-6b8b4b6776-gjrdg\" (UID: \"045520f1-ee73-440d-84a0-ebc935904fa2\") " pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.107009 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/045520f1-ee73-440d-84a0-ebc935904fa2-config\") pod \"route-controller-manager-6b8b4b6776-gjrdg\" (UID: \"045520f1-ee73-440d-84a0-ebc935904fa2\") " pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.107074 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72bgr\" (UniqueName: \"kubernetes.io/projected/88771d4a-dace-45a4-bd87-9f25aaa52f50-kube-api-access-72bgr\") pod \"controller-manager-7664b4d6d8-f5xfm\" (UID: 
\"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.107116 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-proxy-ca-bundles\") pod \"controller-manager-7664b4d6d8-f5xfm\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.108819 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-client-ca\") pod \"controller-manager-7664b4d6d8-f5xfm\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.108916 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-config\") pod \"controller-manager-7664b4d6d8-f5xfm\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.108993 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/045520f1-ee73-440d-84a0-ebc935904fa2-client-ca\") pod \"route-controller-manager-6b8b4b6776-gjrdg\" (UID: \"045520f1-ee73-440d-84a0-ebc935904fa2\") " pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.109177 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-proxy-ca-bundles\") pod \"controller-manager-7664b4d6d8-f5xfm\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.109734 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/045520f1-ee73-440d-84a0-ebc935904fa2-config\") pod \"route-controller-manager-6b8b4b6776-gjrdg\" (UID: \"045520f1-ee73-440d-84a0-ebc935904fa2\") " pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.111455 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/045520f1-ee73-440d-84a0-ebc935904fa2-serving-cert\") pod \"route-controller-manager-6b8b4b6776-gjrdg\" (UID: \"045520f1-ee73-440d-84a0-ebc935904fa2\") " pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.113184 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88771d4a-dace-45a4-bd87-9f25aaa52f50-serving-cert\") pod \"controller-manager-7664b4d6d8-f5xfm\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.123533 4681 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-72bgr\" (UniqueName: \"kubernetes.io/projected/88771d4a-dace-45a4-bd87-9f25aaa52f50-kube-api-access-72bgr\") pod \"controller-manager-7664b4d6d8-f5xfm\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.132787 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42sfl\" (UniqueName: \"kubernetes.io/projected/045520f1-ee73-440d-84a0-ebc935904fa2-kube-api-access-42sfl\") pod \"route-controller-manager-6b8b4b6776-gjrdg\" (UID: \"045520f1-ee73-440d-84a0-ebc935904fa2\") " pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.336324 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.354106 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.460741 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efba4feb-6f7c-4d41-b09c-d622f0e240a9" path="/var/lib/kubelet/pods/efba4feb-6f7c-4d41-b09c-d622f0e240a9/volumes" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.461427 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f75f1bac-c894-483b-a366-854399619cec" path="/var/lib/kubelet/pods/f75f1bac-c894-483b-a366-854399619cec/volumes" Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.729350 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm"] Jan 22 09:05:35 crc kubenswrapper[4681]: W0122 09:05:35.736911 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88771d4a_dace_45a4_bd87_9f25aaa52f50.slice/crio-3261ea3c67323766572033641e11e9d0a1a130754a03fea23a58d53b744ce697 WatchSource:0}: Error finding container 3261ea3c67323766572033641e11e9d0a1a130754a03fea23a58d53b744ce697: Status 404 returned error can't find the container with id 3261ea3c67323766572033641e11e9d0a1a130754a03fea23a58d53b744ce697 Jan 22 09:05:35 crc kubenswrapper[4681]: I0122 09:05:35.797670 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg"] Jan 22 09:05:36 crc kubenswrapper[4681]: I0122 09:05:36.347062 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" event={"ID":"045520f1-ee73-440d-84a0-ebc935904fa2","Type":"ContainerStarted","Data":"5f15ca711c9684978a4c19fa786fb286ea14f501f4928685b1c8e253ba891217"} Jan 22 09:05:36 crc kubenswrapper[4681]: I0122 09:05:36.347360 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" event={"ID":"045520f1-ee73-440d-84a0-ebc935904fa2","Type":"ContainerStarted","Data":"b85d5ae4c3ab46ccc79e5ecdb279694cad49b61e94fa94dba10bb842e66ff771"} Jan 22 09:05:36 crc kubenswrapper[4681]: I0122 09:05:36.347377 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 
09:05:36 crc kubenswrapper[4681]: I0122 09:05:36.349576 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" event={"ID":"88771d4a-dace-45a4-bd87-9f25aaa52f50","Type":"ContainerStarted","Data":"3f888cc05b4d35680900b9af6486d11863bd0459055406849558083f1ac58f9c"} Jan 22 09:05:36 crc kubenswrapper[4681]: I0122 09:05:36.349607 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" event={"ID":"88771d4a-dace-45a4-bd87-9f25aaa52f50","Type":"ContainerStarted","Data":"3261ea3c67323766572033641e11e9d0a1a130754a03fea23a58d53b744ce697"} Jan 22 09:05:36 crc kubenswrapper[4681]: I0122 09:05:36.349791 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:36 crc kubenswrapper[4681]: I0122 09:05:36.358050 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:05:36 crc kubenswrapper[4681]: I0122 09:05:36.367221 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" podStartSLOduration=3.36720445 podStartE2EDuration="3.36720445s" podCreationTimestamp="2026-01-22 09:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:05:36.365858036 +0000 UTC m=+127.191768541" watchObservedRunningTime="2026-01-22 09:05:36.36720445 +0000 UTC m=+127.193114945" Jan 22 09:05:36 crc kubenswrapper[4681]: I0122 09:05:36.374847 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 09:05:36 crc kubenswrapper[4681]: I0122 09:05:36.393690 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" podStartSLOduration=3.393672231 podStartE2EDuration="3.393672231s" podCreationTimestamp="2026-01-22 09:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:05:36.390932832 +0000 UTC m=+127.216843337" watchObservedRunningTime="2026-01-22 09:05:36.393672231 +0000 UTC m=+127.219582736" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.266877 4681 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.268044 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.311157 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.357112 4681 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.357508 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://c2a67c9a932172beaa283e3822140faec7d1075d261cf5baa5dfade505706598" gracePeriod=15 Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.357581 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://cb0db720088c0c3720217f6756481511add404cafce819ef1e9cfeeada8b6525" gracePeriod=15 Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.357657 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://dca2254db0d824fcbf184a198ab5262e492c608a943a2c13c708afbbe9f7809c" gracePeriod=15 Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.357890 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://c29c505da8ca30c818f5ba9818dea035254376f5739c0433c507a6dc7cf4698f" gracePeriod=15 Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.357928 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://7e1ce53f8261ad730e2a43b8c4841fd5e72e3ef3bd4b11f9f32378e3ddb65c57" gracePeriod=15 Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.359160 4681 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 22 09:05:40 crc kubenswrapper[4681]: E0122 09:05:40.359471 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.359491 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 09:05:40 crc kubenswrapper[4681]: E0122 09:05:40.359509 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.359519 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 22 09:05:40 crc kubenswrapper[4681]: E0122 09:05:40.359538 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 
09:05:40.359549 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 22 09:05:40 crc kubenswrapper[4681]: E0122 09:05:40.359568 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.359578 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 09:05:40 crc kubenswrapper[4681]: E0122 09:05:40.359590 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.359602 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 22 09:05:40 crc kubenswrapper[4681]: E0122 09:05:40.359620 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.359630 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 22 09:05:40 crc kubenswrapper[4681]: E0122 09:05:40.359642 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.359651 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.359800 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.359819 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.359832 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.359848 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.359907 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.359925 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.375243 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.375351 4681 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.375392 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.375423 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.375462 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.375507 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.375535 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.375571 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.477150 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.477202 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 
09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.477225 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.477242 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.477291 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.477321 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.477335 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.477354 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.477427 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.477460 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.477479 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.477498 4681 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.477518 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.477536 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.477554 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.477575 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: I0122 09:05:40.608696 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:05:40 crc kubenswrapper[4681]: W0122 09:05:40.641343 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-5ae4d12795a7790b874cbc9a7fa7032af0b14271b2cac08b2556020a1c0f3281 WatchSource:0}: Error finding container 5ae4d12795a7790b874cbc9a7fa7032af0b14271b2cac08b2556020a1c0f3281: Status 404 returned error can't find the container with id 5ae4d12795a7790b874cbc9a7fa7032af0b14271b2cac08b2556020a1c0f3281 Jan 22 09:05:40 crc kubenswrapper[4681]: E0122 09:05:40.644065 4681 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.130:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188d0248dfddddb5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-22 09:05:40.643233205 +0000 UTC m=+131.469143710,LastTimestamp:2026-01-22 09:05:40.643233205 +0000 UTC m=+131.469143710,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 22 09:05:41 crc kubenswrapper[4681]: I0122 09:05:41.087405 4681 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 22 09:05:41 crc kubenswrapper[4681]: I0122 09:05:41.087718 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 22 09:05:41 crc kubenswrapper[4681]: I0122 09:05:41.384623 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7281228eb38806208dedd0dd3b44252085b88b9435ef7af19a225f474bc59904"} Jan 22 09:05:41 crc kubenswrapper[4681]: I0122 09:05:41.384707 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5ae4d12795a7790b874cbc9a7fa7032af0b14271b2cac08b2556020a1c0f3281"} Jan 22 09:05:41 crc kubenswrapper[4681]: I0122 09:05:41.385239 4681 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:41 crc kubenswrapper[4681]: I0122 09:05:41.385536 4681 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:41 crc kubenswrapper[4681]: I0122 09:05:41.387141 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 22 09:05:41 crc kubenswrapper[4681]: I0122 09:05:41.388248 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 22 09:05:41 crc kubenswrapper[4681]: I0122 09:05:41.388837 4681 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7e1ce53f8261ad730e2a43b8c4841fd5e72e3ef3bd4b11f9f32378e3ddb65c57" exitCode=0 Jan 22 09:05:41 crc kubenswrapper[4681]: I0122 09:05:41.388862 4681 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c29c505da8ca30c818f5ba9818dea035254376f5739c0433c507a6dc7cf4698f" exitCode=0 Jan 22 09:05:41 crc kubenswrapper[4681]: I0122 09:05:41.388872 4681 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cb0db720088c0c3720217f6756481511add404cafce819ef1e9cfeeada8b6525" exitCode=0 Jan 22 09:05:41 crc kubenswrapper[4681]: I0122 09:05:41.388880 4681 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dca2254db0d824fcbf184a198ab5262e492c608a943a2c13c708afbbe9f7809c" exitCode=2 Jan 22 09:05:41 crc kubenswrapper[4681]: I0122 09:05:41.388918 4681 scope.go:117] "RemoveContainer" containerID="281d12e11d304b5374e32fa76a0f626fb5cc72dcb259f1e6f63f186558d14faf" Jan 22 09:05:42 crc kubenswrapper[4681]: I0122 09:05:42.399120 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 22 09:05:42 crc kubenswrapper[4681]: I0122 09:05:42.844113 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 22 09:05:42 crc kubenswrapper[4681]: I0122 09:05:42.845638 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:42 crc kubenswrapper[4681]: I0122 09:05:42.846250 4681 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:42 crc kubenswrapper[4681]: I0122 09:05:42.846593 4681 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:42 crc kubenswrapper[4681]: I0122 09:05:42.909977 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 22 09:05:42 crc kubenswrapper[4681]: I0122 09:05:42.910097 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 22 09:05:42 crc kubenswrapper[4681]: I0122 09:05:42.910107 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:05:42 crc kubenswrapper[4681]: I0122 09:05:42.910274 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:05:42 crc kubenswrapper[4681]: I0122 09:05:42.910300 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 22 09:05:42 crc kubenswrapper[4681]: I0122 09:05:42.910378 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:05:42 crc kubenswrapper[4681]: I0122 09:05:42.910581 4681 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:42 crc kubenswrapper[4681]: I0122 09:05:42.910599 4681 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:42 crc kubenswrapper[4681]: I0122 09:05:42.910608 4681 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.416339 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.418796 4681 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c2a67c9a932172beaa283e3822140faec7d1075d261cf5baa5dfade505706598" exitCode=0 Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.418852 4681 scope.go:117] "RemoveContainer" containerID="7e1ce53f8261ad730e2a43b8c4841fd5e72e3ef3bd4b11f9f32378e3ddb65c57" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.418996 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.435188 4681 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.435679 4681 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.447852 4681 scope.go:117] "RemoveContainer" containerID="c29c505da8ca30c818f5ba9818dea035254376f5739c0433c507a6dc7cf4698f" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.468053 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.483743 4681 scope.go:117] "RemoveContainer" containerID="cb0db720088c0c3720217f6756481511add404cafce819ef1e9cfeeada8b6525" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.510448 4681 scope.go:117] "RemoveContainer" containerID="dca2254db0d824fcbf184a198ab5262e492c608a943a2c13c708afbbe9f7809c" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.532887 4681 scope.go:117] "RemoveContainer" containerID="c2a67c9a932172beaa283e3822140faec7d1075d261cf5baa5dfade505706598" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.551605 4681 scope.go:117] "RemoveContainer" 
containerID="dee106e9b193951c3f506f0eb522812132719914b98cc7dbf60e1597311227ad" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.575808 4681 scope.go:117] "RemoveContainer" containerID="7e1ce53f8261ad730e2a43b8c4841fd5e72e3ef3bd4b11f9f32378e3ddb65c57" Jan 22 09:05:43 crc kubenswrapper[4681]: E0122 09:05:43.576202 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e1ce53f8261ad730e2a43b8c4841fd5e72e3ef3bd4b11f9f32378e3ddb65c57\": container with ID starting with 7e1ce53f8261ad730e2a43b8c4841fd5e72e3ef3bd4b11f9f32378e3ddb65c57 not found: ID does not exist" containerID="7e1ce53f8261ad730e2a43b8c4841fd5e72e3ef3bd4b11f9f32378e3ddb65c57" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.576276 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e1ce53f8261ad730e2a43b8c4841fd5e72e3ef3bd4b11f9f32378e3ddb65c57"} err="failed to get container status \"7e1ce53f8261ad730e2a43b8c4841fd5e72e3ef3bd4b11f9f32378e3ddb65c57\": rpc error: code = NotFound desc = could not find container \"7e1ce53f8261ad730e2a43b8c4841fd5e72e3ef3bd4b11f9f32378e3ddb65c57\": container with ID starting with 7e1ce53f8261ad730e2a43b8c4841fd5e72e3ef3bd4b11f9f32378e3ddb65c57 not found: ID does not exist" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.576361 4681 scope.go:117] "RemoveContainer" containerID="c29c505da8ca30c818f5ba9818dea035254376f5739c0433c507a6dc7cf4698f" Jan 22 09:05:43 crc kubenswrapper[4681]: E0122 09:05:43.576778 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c29c505da8ca30c818f5ba9818dea035254376f5739c0433c507a6dc7cf4698f\": container with ID starting with c29c505da8ca30c818f5ba9818dea035254376f5739c0433c507a6dc7cf4698f not found: ID does not exist" containerID="c29c505da8ca30c818f5ba9818dea035254376f5739c0433c507a6dc7cf4698f" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.576813 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c29c505da8ca30c818f5ba9818dea035254376f5739c0433c507a6dc7cf4698f"} err="failed to get container status \"c29c505da8ca30c818f5ba9818dea035254376f5739c0433c507a6dc7cf4698f\": rpc error: code = NotFound desc = could not find container \"c29c505da8ca30c818f5ba9818dea035254376f5739c0433c507a6dc7cf4698f\": container with ID starting with c29c505da8ca30c818f5ba9818dea035254376f5739c0433c507a6dc7cf4698f not found: ID does not exist" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.576833 4681 scope.go:117] "RemoveContainer" containerID="cb0db720088c0c3720217f6756481511add404cafce819ef1e9cfeeada8b6525" Jan 22 09:05:43 crc kubenswrapper[4681]: E0122 09:05:43.577137 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb0db720088c0c3720217f6756481511add404cafce819ef1e9cfeeada8b6525\": container with ID starting with cb0db720088c0c3720217f6756481511add404cafce819ef1e9cfeeada8b6525 not found: ID does not exist" containerID="cb0db720088c0c3720217f6756481511add404cafce819ef1e9cfeeada8b6525" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.577171 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0db720088c0c3720217f6756481511add404cafce819ef1e9cfeeada8b6525"} err="failed to get container status \"cb0db720088c0c3720217f6756481511add404cafce819ef1e9cfeeada8b6525\": rpc error: code = 
NotFound desc = could not find container \"cb0db720088c0c3720217f6756481511add404cafce819ef1e9cfeeada8b6525\": container with ID starting with cb0db720088c0c3720217f6756481511add404cafce819ef1e9cfeeada8b6525 not found: ID does not exist" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.577190 4681 scope.go:117] "RemoveContainer" containerID="dca2254db0d824fcbf184a198ab5262e492c608a943a2c13c708afbbe9f7809c" Jan 22 09:05:43 crc kubenswrapper[4681]: E0122 09:05:43.577537 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dca2254db0d824fcbf184a198ab5262e492c608a943a2c13c708afbbe9f7809c\": container with ID starting with dca2254db0d824fcbf184a198ab5262e492c608a943a2c13c708afbbe9f7809c not found: ID does not exist" containerID="dca2254db0d824fcbf184a198ab5262e492c608a943a2c13c708afbbe9f7809c" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.577610 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dca2254db0d824fcbf184a198ab5262e492c608a943a2c13c708afbbe9f7809c"} err="failed to get container status \"dca2254db0d824fcbf184a198ab5262e492c608a943a2c13c708afbbe9f7809c\": rpc error: code = NotFound desc = could not find container \"dca2254db0d824fcbf184a198ab5262e492c608a943a2c13c708afbbe9f7809c\": container with ID starting with dca2254db0d824fcbf184a198ab5262e492c608a943a2c13c708afbbe9f7809c not found: ID does not exist" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.577663 4681 scope.go:117] "RemoveContainer" containerID="c2a67c9a932172beaa283e3822140faec7d1075d261cf5baa5dfade505706598" Jan 22 09:05:43 crc kubenswrapper[4681]: E0122 09:05:43.578032 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2a67c9a932172beaa283e3822140faec7d1075d261cf5baa5dfade505706598\": container with ID starting with c2a67c9a932172beaa283e3822140faec7d1075d261cf5baa5dfade505706598 not found: ID does not exist" containerID="c2a67c9a932172beaa283e3822140faec7d1075d261cf5baa5dfade505706598" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.578070 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2a67c9a932172beaa283e3822140faec7d1075d261cf5baa5dfade505706598"} err="failed to get container status \"c2a67c9a932172beaa283e3822140faec7d1075d261cf5baa5dfade505706598\": rpc error: code = NotFound desc = could not find container \"c2a67c9a932172beaa283e3822140faec7d1075d261cf5baa5dfade505706598\": container with ID starting with c2a67c9a932172beaa283e3822140faec7d1075d261cf5baa5dfade505706598 not found: ID does not exist" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.578092 4681 scope.go:117] "RemoveContainer" containerID="dee106e9b193951c3f506f0eb522812132719914b98cc7dbf60e1597311227ad" Jan 22 09:05:43 crc kubenswrapper[4681]: E0122 09:05:43.578447 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dee106e9b193951c3f506f0eb522812132719914b98cc7dbf60e1597311227ad\": container with ID starting with dee106e9b193951c3f506f0eb522812132719914b98cc7dbf60e1597311227ad not found: ID does not exist" containerID="dee106e9b193951c3f506f0eb522812132719914b98cc7dbf60e1597311227ad" Jan 22 09:05:43 crc kubenswrapper[4681]: I0122 09:05:43.578497 4681 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dee106e9b193951c3f506f0eb522812132719914b98cc7dbf60e1597311227ad"} err="failed to get container status \"dee106e9b193951c3f506f0eb522812132719914b98cc7dbf60e1597311227ad\": rpc error: code = NotFound desc = could not find container \"dee106e9b193951c3f506f0eb522812132719914b98cc7dbf60e1597311227ad\": container with ID starting with dee106e9b193951c3f506f0eb522812132719914b98cc7dbf60e1597311227ad not found: ID does not exist" Jan 22 09:05:43 crc kubenswrapper[4681]: E0122 09:05:43.777423 4681 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.130:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188d0248dfddddb5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-22 09:05:40.643233205 +0000 UTC m=+131.469143710,LastTimestamp:2026-01-22 09:05:40.643233205 +0000 UTC m=+131.469143710,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 22 09:05:45 crc kubenswrapper[4681]: I0122 09:05:45.444375 4681 generic.go:334] "Generic (PLEG): container finished" podID="da4558af-4f26-46a4-839c-a5d77f360bfc" containerID="b93de64843ecbaf485ac7a1abe2b7b3e69d53ee49df43c9a1fec60fcfa11c725" exitCode=0 Jan 22 09:05:45 crc kubenswrapper[4681]: I0122 09:05:45.444619 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"da4558af-4f26-46a4-839c-a5d77f360bfc","Type":"ContainerDied","Data":"b93de64843ecbaf485ac7a1abe2b7b3e69d53ee49df43c9a1fec60fcfa11c725"} Jan 22 09:05:45 crc kubenswrapper[4681]: I0122 09:05:45.445858 4681 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:45 crc kubenswrapper[4681]: I0122 09:05:45.446440 4681 status_manager.go:851] "Failed to get status for pod" podUID="da4558af-4f26-46a4-839c-a5d77f360bfc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:46 crc kubenswrapper[4681]: I0122 09:05:46.875388 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 22 09:05:46 crc kubenswrapper[4681]: I0122 09:05:46.876608 4681 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:46 crc kubenswrapper[4681]: I0122 09:05:46.877078 4681 status_manager.go:851] "Failed to get status for pod" podUID="da4558af-4f26-46a4-839c-a5d77f360bfc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:46 crc kubenswrapper[4681]: I0122 09:05:46.991575 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/da4558af-4f26-46a4-839c-a5d77f360bfc-kubelet-dir\") pod \"da4558af-4f26-46a4-839c-a5d77f360bfc\" (UID: \"da4558af-4f26-46a4-839c-a5d77f360bfc\") " Jan 22 09:05:46 crc kubenswrapper[4681]: I0122 09:05:46.991710 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/da4558af-4f26-46a4-839c-a5d77f360bfc-var-lock\") pod \"da4558af-4f26-46a4-839c-a5d77f360bfc\" (UID: \"da4558af-4f26-46a4-839c-a5d77f360bfc\") " Jan 22 09:05:46 crc kubenswrapper[4681]: I0122 09:05:46.991843 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da4558af-4f26-46a4-839c-a5d77f360bfc-var-lock" (OuterVolumeSpecName: "var-lock") pod "da4558af-4f26-46a4-839c-a5d77f360bfc" (UID: "da4558af-4f26-46a4-839c-a5d77f360bfc"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:05:46 crc kubenswrapper[4681]: I0122 09:05:46.991751 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da4558af-4f26-46a4-839c-a5d77f360bfc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "da4558af-4f26-46a4-839c-a5d77f360bfc" (UID: "da4558af-4f26-46a4-839c-a5d77f360bfc"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:05:46 crc kubenswrapper[4681]: I0122 09:05:46.991919 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da4558af-4f26-46a4-839c-a5d77f360bfc-kube-api-access\") pod \"da4558af-4f26-46a4-839c-a5d77f360bfc\" (UID: \"da4558af-4f26-46a4-839c-a5d77f360bfc\") " Jan 22 09:05:46 crc kubenswrapper[4681]: I0122 09:05:46.992639 4681 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/da4558af-4f26-46a4-839c-a5d77f360bfc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:46 crc kubenswrapper[4681]: I0122 09:05:46.992675 4681 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/da4558af-4f26-46a4-839c-a5d77f360bfc-var-lock\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:47 crc kubenswrapper[4681]: I0122 09:05:47.002773 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da4558af-4f26-46a4-839c-a5d77f360bfc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "da4558af-4f26-46a4-839c-a5d77f360bfc" (UID: "da4558af-4f26-46a4-839c-a5d77f360bfc"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:05:47 crc kubenswrapper[4681]: I0122 09:05:47.093887 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da4558af-4f26-46a4-839c-a5d77f360bfc-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 22 09:05:47 crc kubenswrapper[4681]: I0122 09:05:47.461579 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 22 09:05:47 crc kubenswrapper[4681]: I0122 09:05:47.465397 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"da4558af-4f26-46a4-839c-a5d77f360bfc","Type":"ContainerDied","Data":"d06a10d49a5f9f8b34953c342d4498df64f6d7e95ab951816477eb9591c773c8"} Jan 22 09:05:47 crc kubenswrapper[4681]: I0122 09:05:47.465457 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d06a10d49a5f9f8b34953c342d4498df64f6d7e95ab951816477eb9591c773c8" Jan 22 09:05:47 crc kubenswrapper[4681]: I0122 09:05:47.490488 4681 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:47 crc kubenswrapper[4681]: I0122 09:05:47.491358 4681 status_manager.go:851] "Failed to get status for pod" podUID="da4558af-4f26-46a4-839c-a5d77f360bfc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:49 crc kubenswrapper[4681]: I0122 09:05:49.457066 4681 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 
38.102.83.130:6443: connect: connection refused" Jan 22 09:05:49 crc kubenswrapper[4681]: I0122 09:05:49.458141 4681 status_manager.go:851] "Failed to get status for pod" podUID="da4558af-4f26-46a4-839c-a5d77f360bfc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:50 crc kubenswrapper[4681]: E0122 09:05:50.316553 4681 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:50 crc kubenswrapper[4681]: E0122 09:05:50.317077 4681 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:50 crc kubenswrapper[4681]: E0122 09:05:50.317530 4681 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:50 crc kubenswrapper[4681]: E0122 09:05:50.317815 4681 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:50 crc kubenswrapper[4681]: E0122 09:05:50.318252 4681 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:50 crc kubenswrapper[4681]: I0122 09:05:50.318347 4681 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 22 09:05:50 crc kubenswrapper[4681]: E0122 09:05:50.318820 4681 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="200ms" Jan 22 09:05:50 crc kubenswrapper[4681]: E0122 09:05:50.519571 4681 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="400ms" Jan 22 09:05:50 crc kubenswrapper[4681]: E0122 09:05:50.921030 4681 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="800ms" Jan 22 09:05:51 crc kubenswrapper[4681]: E0122 09:05:51.723346 4681 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="1.6s" Jan 22 09:05:53 crc kubenswrapper[4681]: E0122 09:05:53.324386 4681 controller.go:145] "Failed to ensure lease exists, will 
retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.130:6443: connect: connection refused" interval="3.2s" Jan 22 09:05:53 crc kubenswrapper[4681]: I0122 09:05:53.452838 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:53 crc kubenswrapper[4681]: I0122 09:05:53.454478 4681 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:53 crc kubenswrapper[4681]: I0122 09:05:53.454857 4681 status_manager.go:851] "Failed to get status for pod" podUID="da4558af-4f26-46a4-839c-a5d77f360bfc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:53 crc kubenswrapper[4681]: I0122 09:05:53.470847 4681 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9db32b89-90cc-4e93-916a-257088ca3c23" Jan 22 09:05:53 crc kubenswrapper[4681]: I0122 09:05:53.470896 4681 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9db32b89-90cc-4e93-916a-257088ca3c23" Jan 22 09:05:53 crc kubenswrapper[4681]: E0122 09:05:53.471626 4681 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:53 crc kubenswrapper[4681]: I0122 09:05:53.472360 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:53 crc kubenswrapper[4681]: E0122 09:05:53.778165 4681 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.130:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188d0248dfddddb5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-22 09:05:40.643233205 +0000 UTC m=+131.469143710,LastTimestamp:2026-01-22 09:05:40.643233205 +0000 UTC m=+131.469143710,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 22 09:05:54 crc kubenswrapper[4681]: I0122 09:05:54.512694 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 22 09:05:54 crc kubenswrapper[4681]: I0122 09:05:54.512949 4681 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="91b42d0d981cb9e194bf8ad0a8b945e348cde3f9ed52985d54c501b1768d7f8f" exitCode=1 Jan 22 09:05:54 crc kubenswrapper[4681]: I0122 09:05:54.513035 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"91b42d0d981cb9e194bf8ad0a8b945e348cde3f9ed52985d54c501b1768d7f8f"} Jan 22 09:05:54 crc kubenswrapper[4681]: I0122 09:05:54.513978 4681 scope.go:117] "RemoveContainer" containerID="91b42d0d981cb9e194bf8ad0a8b945e348cde3f9ed52985d54c501b1768d7f8f" Jan 22 09:05:54 crc kubenswrapper[4681]: I0122 09:05:54.514506 4681 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:54 crc kubenswrapper[4681]: I0122 09:05:54.515048 4681 status_manager.go:851] "Failed to get status for pod" podUID="da4558af-4f26-46a4-839c-a5d77f360bfc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:54 crc kubenswrapper[4681]: I0122 09:05:54.515582 4681 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="fda74804b67e603729ea3ce19fb366b68866a6e45c08f9b6fc5df2638dfc862c" exitCode=0 Jan 22 09:05:54 crc kubenswrapper[4681]: I0122 09:05:54.515641 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"fda74804b67e603729ea3ce19fb366b68866a6e45c08f9b6fc5df2638dfc862c"} Jan 22 09:05:54 crc kubenswrapper[4681]: I0122 09:05:54.515686 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f1c7f6ea40056e6ee7276091ba9212f231c6bab2476dbc32b3bd3982b8fcd182"} Jan 22 09:05:54 crc kubenswrapper[4681]: I0122 09:05:54.515760 4681 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:54 crc kubenswrapper[4681]: I0122 09:05:54.516032 4681 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9db32b89-90cc-4e93-916a-257088ca3c23" Jan 22 09:05:54 crc kubenswrapper[4681]: I0122 09:05:54.516055 4681 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9db32b89-90cc-4e93-916a-257088ca3c23" Jan 22 09:05:54 crc kubenswrapper[4681]: I0122 09:05:54.516360 4681 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:54 crc kubenswrapper[4681]: E0122 09:05:54.516605 4681 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:54 crc kubenswrapper[4681]: I0122 09:05:54.516867 4681 status_manager.go:851] "Failed to get status for pod" podUID="da4558af-4f26-46a4-839c-a5d77f360bfc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:54 crc kubenswrapper[4681]: I0122 09:05:54.517373 4681 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.130:6443: connect: connection refused" Jan 22 09:05:55 crc kubenswrapper[4681]: I0122 09:05:55.524723 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6d9efab5d9bb7983aa792d3a03a20e855e3ed811b98d947088ae583252b6b9c2"} Jan 22 09:05:55 crc kubenswrapper[4681]: I0122 09:05:55.525149 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5f06abe8598d67f1d34226b7fe2d7ebd5b057fc3d15f5912aa85100d03f27d30"} Jan 22 09:05:55 crc kubenswrapper[4681]: I0122 
09:05:55.525159 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4f18622caed3d2b91a1eb562673426dd2da14ea547bb778bd3581b3d07e855b0"} Jan 22 09:05:55 crc kubenswrapper[4681]: I0122 09:05:55.529600 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 22 09:05:55 crc kubenswrapper[4681]: I0122 09:05:55.529668 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"68ab992999df256363818391f7560d341689420f471dc3e4f3a223d9ba9fdd38"} Jan 22 09:05:56 crc kubenswrapper[4681]: I0122 09:05:56.031973 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:05:56 crc kubenswrapper[4681]: I0122 09:05:56.032045 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:05:56 crc kubenswrapper[4681]: I0122 09:05:56.537903 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7660fdbc7f7c1d28ad0e83c0218d77f5ee8196c5a3430eed7897710b8f83560a"} Jan 22 09:05:56 crc kubenswrapper[4681]: I0122 09:05:56.537966 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a9c20a1175c976d2deed10d351c6137b34c1519265dcd5ca45e4002d72104512"} Jan 22 09:05:56 crc kubenswrapper[4681]: I0122 09:05:56.538095 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:56 crc kubenswrapper[4681]: I0122 09:05:56.538201 4681 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9db32b89-90cc-4e93-916a-257088ca3c23" Jan 22 09:05:56 crc kubenswrapper[4681]: I0122 09:05:56.538225 4681 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9db32b89-90cc-4e93-916a-257088ca3c23" Jan 22 09:05:58 crc kubenswrapper[4681]: I0122 09:05:58.123535 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:05:58 crc kubenswrapper[4681]: I0122 09:05:58.130892 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:05:58 crc kubenswrapper[4681]: I0122 09:05:58.472501 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:58 crc kubenswrapper[4681]: I0122 09:05:58.472572 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:58 crc kubenswrapper[4681]: I0122 09:05:58.478630 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:05:58 crc kubenswrapper[4681]: I0122 09:05:58.547885 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:05:59 crc kubenswrapper[4681]: I0122 09:05:59.482809 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" podUID="e3bd3021-b5e7-4c2c-8152-6f0450cea681" containerName="oauth-openshift" containerID="cri-o://9162b0f9c6abc4a2f4899f6345381c5f95c25d0a9d458f34d67bfb53d23591d8" gracePeriod=15 Jan 22 09:05:59 crc kubenswrapper[4681]: I0122 09:05:59.975719 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:06:00 crc kubenswrapper[4681]: I0122 09:06:00.559015 4681 generic.go:334] "Generic (PLEG): container finished" podID="e3bd3021-b5e7-4c2c-8152-6f0450cea681" containerID="9162b0f9c6abc4a2f4899f6345381c5f95c25d0a9d458f34d67bfb53d23591d8" exitCode=0 Jan 22 09:06:00 crc kubenswrapper[4681]: I0122 09:06:00.559092 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" Jan 22 09:06:00 crc kubenswrapper[4681]: I0122 09:06:00.559126 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" event={"ID":"e3bd3021-b5e7-4c2c-8152-6f0450cea681","Type":"ContainerDied","Data":"9162b0f9c6abc4a2f4899f6345381c5f95c25d0a9d458f34d67bfb53d23591d8"} Jan 22 09:06:00 crc kubenswrapper[4681]: I0122 09:06:00.559619 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dmfxj" event={"ID":"e3bd3021-b5e7-4c2c-8152-6f0450cea681","Type":"ContainerDied","Data":"c910b5bb75ecacc968612d71a2d33e815b28083b719d764393e78e0a821a15a0"} Jan 22 09:06:00 crc kubenswrapper[4681]: I0122 09:06:00.559644 4681 scope.go:117] "RemoveContainer" containerID="9162b0f9c6abc4a2f4899f6345381c5f95c25d0a9d458f34d67bfb53d23591d8" Jan 22 09:06:00 crc kubenswrapper[4681]: I0122 09:06:00.579038 4681 scope.go:117] "RemoveContainer" containerID="9162b0f9c6abc4a2f4899f6345381c5f95c25d0a9d458f34d67bfb53d23591d8" Jan 22 09:06:00 crc kubenswrapper[4681]: E0122 09:06:00.579433 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9162b0f9c6abc4a2f4899f6345381c5f95c25d0a9d458f34d67bfb53d23591d8\": container with ID starting with 9162b0f9c6abc4a2f4899f6345381c5f95c25d0a9d458f34d67bfb53d23591d8 not found: ID does not exist" containerID="9162b0f9c6abc4a2f4899f6345381c5f95c25d0a9d458f34d67bfb53d23591d8" Jan 22 09:06:00 crc kubenswrapper[4681]: I0122 09:06:00.579482 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9162b0f9c6abc4a2f4899f6345381c5f95c25d0a9d458f34d67bfb53d23591d8"} err="failed to get container status \"9162b0f9c6abc4a2f4899f6345381c5f95c25d0a9d458f34d67bfb53d23591d8\": rpc error: code = NotFound desc = could not find container \"9162b0f9c6abc4a2f4899f6345381c5f95c25d0a9d458f34d67bfb53d23591d8\": container with ID starting with 9162b0f9c6abc4a2f4899f6345381c5f95c25d0a9d458f34d67bfb53d23591d8 
not found: ID does not exist" Jan 22 09:06:01 crc kubenswrapper[4681]: I0122 09:06:01.548726 4681 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.578023 4681 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9db32b89-90cc-4e93-916a-257088ca3c23" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.578519 4681 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9db32b89-90cc-4e93-916a-257088ca3c23" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.587214 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.590700 4681 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="23dbd6ec-5928-4610-9b72-fb4608418858" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.594579 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-session\") pod \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.594676 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-cliconfig\") pod \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.594747 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-login\") pod \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.594797 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e3bd3021-b5e7-4c2c-8152-6f0450cea681-audit-dir\") pod \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.594899 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-audit-policies\") pod \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.594952 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvbkv\" (UniqueName: \"kubernetes.io/projected/e3bd3021-b5e7-4c2c-8152-6f0450cea681-kube-api-access-jvbkv\") pod \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.595016 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-trusted-ca-bundle\") pod \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.595016 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3bd3021-b5e7-4c2c-8152-6f0450cea681-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e3bd3021-b5e7-4c2c-8152-6f0450cea681" (UID: "e3bd3021-b5e7-4c2c-8152-6f0450cea681"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.595063 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-error\") pod \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.595207 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-provider-selection\") pod \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.596128 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-ocp-branding-template\") pod \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.596232 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-idp-0-file-data\") pod \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.596342 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-router-certs\") pod \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.596401 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-service-ca\") pod \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.596502 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-serving-cert\") pod \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\" (UID: \"e3bd3021-b5e7-4c2c-8152-6f0450cea681\") " Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.596413 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "e3bd3021-b5e7-4c2c-8152-6f0450cea681" (UID: "e3bd3021-b5e7-4c2c-8152-6f0450cea681"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.596485 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "e3bd3021-b5e7-4c2c-8152-6f0450cea681" (UID: "e3bd3021-b5e7-4c2c-8152-6f0450cea681"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.597129 4681 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.597166 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.597174 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "e3bd3021-b5e7-4c2c-8152-6f0450cea681" (UID: "e3bd3021-b5e7-4c2c-8152-6f0450cea681"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.597193 4681 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e3bd3021-b5e7-4c2c-8152-6f0450cea681-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.597844 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "e3bd3021-b5e7-4c2c-8152-6f0450cea681" (UID: "e3bd3021-b5e7-4c2c-8152-6f0450cea681"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.605308 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "e3bd3021-b5e7-4c2c-8152-6f0450cea681" (UID: "e3bd3021-b5e7-4c2c-8152-6f0450cea681"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.605654 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3bd3021-b5e7-4c2c-8152-6f0450cea681-kube-api-access-jvbkv" (OuterVolumeSpecName: "kube-api-access-jvbkv") pod "e3bd3021-b5e7-4c2c-8152-6f0450cea681" (UID: "e3bd3021-b5e7-4c2c-8152-6f0450cea681"). InnerVolumeSpecName "kube-api-access-jvbkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.605785 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "e3bd3021-b5e7-4c2c-8152-6f0450cea681" (UID: "e3bd3021-b5e7-4c2c-8152-6f0450cea681"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.607544 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "e3bd3021-b5e7-4c2c-8152-6f0450cea681" (UID: "e3bd3021-b5e7-4c2c-8152-6f0450cea681"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.607856 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "e3bd3021-b5e7-4c2c-8152-6f0450cea681" (UID: "e3bd3021-b5e7-4c2c-8152-6f0450cea681"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.608495 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "e3bd3021-b5e7-4c2c-8152-6f0450cea681" (UID: "e3bd3021-b5e7-4c2c-8152-6f0450cea681"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.609136 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "e3bd3021-b5e7-4c2c-8152-6f0450cea681" (UID: "e3bd3021-b5e7-4c2c-8152-6f0450cea681"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.616519 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "e3bd3021-b5e7-4c2c-8152-6f0450cea681" (UID: "e3bd3021-b5e7-4c2c-8152-6f0450cea681"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.616993 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "e3bd3021-b5e7-4c2c-8152-6f0450cea681" (UID: "e3bd3021-b5e7-4c2c-8152-6f0450cea681"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.698187 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.698224 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvbkv\" (UniqueName: \"kubernetes.io/projected/e3bd3021-b5e7-4c2c-8152-6f0450cea681-kube-api-access-jvbkv\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.698241 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.698254 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.698294 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.698310 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.698325 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.698340 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.698353 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.698366 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:02 crc kubenswrapper[4681]: I0122 09:06:02.698378 4681 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e3bd3021-b5e7-4c2c-8152-6f0450cea681-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:03 crc kubenswrapper[4681]: I0122 09:06:03.585574 4681 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="9db32b89-90cc-4e93-916a-257088ca3c23" Jan 22 09:06:03 crc kubenswrapper[4681]: I0122 09:06:03.585630 4681 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9db32b89-90cc-4e93-916a-257088ca3c23" Jan 22 09:06:09 crc kubenswrapper[4681]: I0122 09:06:09.470130 4681 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="23dbd6ec-5928-4610-9b72-fb4608418858" Jan 22 09:06:10 crc kubenswrapper[4681]: I0122 09:06:10.572845 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 22 09:06:10 crc kubenswrapper[4681]: I0122 09:06:10.826546 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 22 09:06:11 crc kubenswrapper[4681]: I0122 09:06:11.516744 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 22 09:06:11 crc kubenswrapper[4681]: I0122 09:06:11.618532 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 22 09:06:11 crc kubenswrapper[4681]: I0122 09:06:11.799370 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 22 09:06:11 crc kubenswrapper[4681]: I0122 09:06:11.814435 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 22 09:06:11 crc kubenswrapper[4681]: I0122 09:06:11.873013 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 22 09:06:12 crc kubenswrapper[4681]: I0122 09:06:12.419483 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 22 09:06:12 crc kubenswrapper[4681]: I0122 09:06:12.449119 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 22 09:06:12 crc kubenswrapper[4681]: I0122 09:06:12.470150 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 22 09:06:12 crc kubenswrapper[4681]: I0122 09:06:12.888887 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 22 09:06:12 crc kubenswrapper[4681]: I0122 09:06:12.989877 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 22 09:06:13 crc kubenswrapper[4681]: I0122 09:06:13.084422 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 22 09:06:13 crc kubenswrapper[4681]: I0122 09:06:13.089298 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 22 09:06:13 crc kubenswrapper[4681]: I0122 09:06:13.581410 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 22 09:06:13 crc kubenswrapper[4681]: I0122 09:06:13.650437 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 
22 09:06:13 crc kubenswrapper[4681]: I0122 09:06:13.879177 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 22 09:06:13 crc kubenswrapper[4681]: I0122 09:06:13.922111 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 22 09:06:14 crc kubenswrapper[4681]: I0122 09:06:14.087044 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 22 09:06:14 crc kubenswrapper[4681]: I0122 09:06:14.157578 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 22 09:06:14 crc kubenswrapper[4681]: I0122 09:06:14.554081 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 22 09:06:14 crc kubenswrapper[4681]: I0122 09:06:14.646047 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 22 09:06:14 crc kubenswrapper[4681]: I0122 09:06:14.725406 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 22 09:06:14 crc kubenswrapper[4681]: I0122 09:06:14.819658 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 22 09:06:14 crc kubenswrapper[4681]: I0122 09:06:14.843940 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 22 09:06:14 crc kubenswrapper[4681]: I0122 09:06:14.846684 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 22 09:06:14 crc kubenswrapper[4681]: I0122 09:06:14.862380 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 22 09:06:14 crc kubenswrapper[4681]: I0122 09:06:14.919610 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 22 09:06:14 crc kubenswrapper[4681]: I0122 09:06:14.966055 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 22 09:06:14 crc kubenswrapper[4681]: I0122 09:06:14.975331 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.040396 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.080226 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.165915 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.258528 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.265543 4681 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.273229 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.359934 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.380157 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.453151 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.554186 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.600683 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.679712 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.716718 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.766132 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.849487 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.924058 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.951452 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 22 09:06:15 crc kubenswrapper[4681]: I0122 09:06:15.951569 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 22 09:06:16 crc kubenswrapper[4681]: I0122 09:06:16.009323 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 22 09:06:16 crc kubenswrapper[4681]: I0122 09:06:16.018446 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 22 09:06:16 crc kubenswrapper[4681]: I0122 09:06:16.030399 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 22 09:06:16 crc kubenswrapper[4681]: I0122 09:06:16.100151 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 22 09:06:16 crc kubenswrapper[4681]: I0122 09:06:16.141690 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 22 09:06:16 crc kubenswrapper[4681]: I0122 09:06:16.155953 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 22 09:06:16 crc kubenswrapper[4681]: 
I0122 09:06:16.223158 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 22 09:06:16 crc kubenswrapper[4681]: I0122 09:06:16.250311 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 22 09:06:16 crc kubenswrapper[4681]: I0122 09:06:16.357235 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 22 09:06:16 crc kubenswrapper[4681]: I0122 09:06:16.391082 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 22 09:06:16 crc kubenswrapper[4681]: I0122 09:06:16.474405 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 22 09:06:16 crc kubenswrapper[4681]: I0122 09:06:16.525050 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 22 09:06:16 crc kubenswrapper[4681]: I0122 09:06:16.686352 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 22 09:06:16 crc kubenswrapper[4681]: I0122 09:06:16.767746 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 22 09:06:16 crc kubenswrapper[4681]: I0122 09:06:16.771474 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 22 09:06:16 crc kubenswrapper[4681]: I0122 09:06:16.771754 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 22 09:06:16 crc kubenswrapper[4681]: I0122 09:06:16.811837 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 22 09:06:16 crc kubenswrapper[4681]: I0122 09:06:16.891803 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 22 09:06:17 crc kubenswrapper[4681]: I0122 09:06:17.056139 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 22 09:06:17 crc kubenswrapper[4681]: I0122 09:06:17.137750 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 22 09:06:17 crc kubenswrapper[4681]: I0122 09:06:17.269411 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 22 09:06:17 crc kubenswrapper[4681]: I0122 09:06:17.295605 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 22 09:06:17 crc kubenswrapper[4681]: I0122 09:06:17.464054 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 22 09:06:17 crc kubenswrapper[4681]: I0122 09:06:17.549411 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 22 09:06:17 crc kubenswrapper[4681]: I0122 09:06:17.562816 4681 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 22 09:06:17 crc kubenswrapper[4681]: I0122 09:06:17.609427 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 22 09:06:17 crc kubenswrapper[4681]: I0122 09:06:17.611953 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 22 09:06:17 crc kubenswrapper[4681]: I0122 09:06:17.671463 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 22 09:06:17 crc kubenswrapper[4681]: I0122 09:06:17.707848 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 22 09:06:17 crc kubenswrapper[4681]: I0122 09:06:17.752743 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 22 09:06:17 crc kubenswrapper[4681]: I0122 09:06:17.903890 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 22 09:06:17 crc kubenswrapper[4681]: I0122 09:06:17.944241 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.010858 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.010968 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.015490 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.022101 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.115030 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.127340 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.192712 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.243156 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.308345 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.377030 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.419144 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.510221 4681 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.646154 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.672086 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.698835 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.754642 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.782657 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.891858 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.931124 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.963139 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 22 09:06:18 crc kubenswrapper[4681]: I0122 09:06:18.994703 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 22 09:06:19 crc kubenswrapper[4681]: I0122 09:06:19.172770 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 22 09:06:19 crc kubenswrapper[4681]: I0122 09:06:19.188131 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 22 09:06:19 crc kubenswrapper[4681]: I0122 09:06:19.239756 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 22 09:06:19 crc kubenswrapper[4681]: I0122 09:06:19.249695 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 22 09:06:19 crc kubenswrapper[4681]: I0122 09:06:19.265467 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 22 09:06:19 crc kubenswrapper[4681]: I0122 09:06:19.361793 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 22 09:06:19 crc kubenswrapper[4681]: I0122 09:06:19.410636 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 22 09:06:19 crc kubenswrapper[4681]: I0122 09:06:19.612580 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 22 09:06:19 crc kubenswrapper[4681]: I0122 09:06:19.642773 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 22 09:06:19 crc kubenswrapper[4681]: I0122 
09:06:19.658702 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 22 09:06:19 crc kubenswrapper[4681]: I0122 09:06:19.862755 4681 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 22 09:06:19 crc kubenswrapper[4681]: I0122 09:06:19.961531 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 22 09:06:19 crc kubenswrapper[4681]: I0122 09:06:19.965077 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 22 09:06:19 crc kubenswrapper[4681]: I0122 09:06:19.996822 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 22 09:06:20 crc kubenswrapper[4681]: I0122 09:06:20.030318 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 22 09:06:20 crc kubenswrapper[4681]: I0122 09:06:20.069141 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 22 09:06:20 crc kubenswrapper[4681]: I0122 09:06:20.173747 4681 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 22 09:06:20 crc kubenswrapper[4681]: I0122 09:06:20.272774 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 22 09:06:20 crc kubenswrapper[4681]: I0122 09:06:20.304346 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 22 09:06:20 crc kubenswrapper[4681]: I0122 09:06:20.420708 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 22 09:06:20 crc kubenswrapper[4681]: I0122 09:06:20.441645 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 22 09:06:20 crc kubenswrapper[4681]: I0122 09:06:20.473210 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 22 09:06:20 crc kubenswrapper[4681]: I0122 09:06:20.537244 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 22 09:06:20 crc kubenswrapper[4681]: I0122 09:06:20.661784 4681 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 22 09:06:20 crc kubenswrapper[4681]: I0122 09:06:20.701008 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 22 09:06:20 crc kubenswrapper[4681]: I0122 09:06:20.725064 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 22 09:06:20 crc kubenswrapper[4681]: I0122 09:06:20.737087 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 22 09:06:20 crc kubenswrapper[4681]: I0122 09:06:20.905596 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 22 09:06:20 crc kubenswrapper[4681]: I0122 
09:06:20.908443 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 22 09:06:20 crc kubenswrapper[4681]: I0122 09:06:20.926097 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.064488 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.077518 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.086925 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.100500 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.114718 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.190852 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.195457 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.245171 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.281717 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.291115 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.308668 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.535534 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.539321 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.571255 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.639962 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.681523 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.716225 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 22 09:06:21 crc 
kubenswrapper[4681]: I0122 09:06:21.753891 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.790614 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.839718 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.872782 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.874864 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 22 09:06:21 crc kubenswrapper[4681]: I0122 09:06:21.981010 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 22 09:06:22 crc kubenswrapper[4681]: I0122 09:06:22.116909 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 22 09:06:22 crc kubenswrapper[4681]: I0122 09:06:22.197239 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 22 09:06:22 crc kubenswrapper[4681]: I0122 09:06:22.213885 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 22 09:06:22 crc kubenswrapper[4681]: I0122 09:06:22.267007 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 22 09:06:22 crc kubenswrapper[4681]: I0122 09:06:22.355828 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 22 09:06:22 crc kubenswrapper[4681]: I0122 09:06:22.647800 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 22 09:06:22 crc kubenswrapper[4681]: I0122 09:06:22.654482 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 22 09:06:22 crc kubenswrapper[4681]: I0122 09:06:22.679211 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 22 09:06:22 crc kubenswrapper[4681]: I0122 09:06:22.711184 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 22 09:06:22 crc kubenswrapper[4681]: I0122 09:06:22.715948 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 22 09:06:22 crc kubenswrapper[4681]: I0122 09:06:22.726190 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 22 09:06:22 crc kubenswrapper[4681]: I0122 09:06:22.914840 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 22 09:06:22 crc kubenswrapper[4681]: I0122 09:06:22.932191 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 22 09:06:22 crc kubenswrapper[4681]: I0122 09:06:22.974188 4681 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 22 09:06:22 crc kubenswrapper[4681]: I0122 09:06:22.992528 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 22 09:06:23 crc kubenswrapper[4681]: I0122 09:06:23.101622 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 22 09:06:23 crc kubenswrapper[4681]: I0122 09:06:23.131505 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 22 09:06:23 crc kubenswrapper[4681]: I0122 09:06:23.164123 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 22 09:06:23 crc kubenswrapper[4681]: I0122 09:06:23.177804 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 22 09:06:23 crc kubenswrapper[4681]: I0122 09:06:23.181090 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 22 09:06:23 crc kubenswrapper[4681]: I0122 09:06:23.289397 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 22 09:06:23 crc kubenswrapper[4681]: I0122 09:06:23.332870 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 22 09:06:23 crc kubenswrapper[4681]: I0122 09:06:23.470034 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 22 09:06:23 crc kubenswrapper[4681]: I0122 09:06:23.523818 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 22 09:06:23 crc kubenswrapper[4681]: I0122 09:06:23.546094 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 22 09:06:23 crc kubenswrapper[4681]: I0122 09:06:23.597845 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 22 09:06:23 crc kubenswrapper[4681]: I0122 09:06:23.661123 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 22 09:06:23 crc kubenswrapper[4681]: I0122 09:06:23.684959 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 22 09:06:23 crc kubenswrapper[4681]: I0122 09:06:23.733468 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 22 09:06:23 crc kubenswrapper[4681]: I0122 09:06:23.807222 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 22 09:06:23 crc kubenswrapper[4681]: I0122 09:06:23.890227 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 22 09:06:23 crc kubenswrapper[4681]: I0122 09:06:23.968061 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 
09:06:24.074027 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 09:06:24.079841 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 09:06:24.106553 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 09:06:24.163386 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 09:06:24.321249 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 09:06:24.335831 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 09:06:24.338162 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 09:06:24.422946 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 09:06:24.448213 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 09:06:24.472133 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 09:06:24.546753 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 09:06:24.547160 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 09:06:24.550837 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 09:06:24.662726 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 09:06:24.666871 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 09:06:24.700676 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 09:06:24.811860 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 22 09:06:24 crc kubenswrapper[4681]: I0122 09:06:24.972912 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.102324 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.154668 4681 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.243878 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.259171 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.394444 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.478978 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.610747 4681 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.636617 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.672134 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.699602 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.738609 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.767007 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.810179 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.835210 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.841368 4681 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.950554 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.956225 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 22 09:06:25 crc kubenswrapper[4681]: I0122 09:06:25.987535 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.011369 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.045707 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.045774 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.257645 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.508675 4681 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.510229 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=46.510210022 podStartE2EDuration="46.510210022s" podCreationTimestamp="2026-01-22 09:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:06:01.338410144 +0000 UTC m=+152.164320659" watchObservedRunningTime="2026-01-22 09:06:26.510210022 +0000 UTC m=+177.336120537" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.515825 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dmfxj","openshift-kube-apiserver/kube-apiserver-crc"] Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.515886 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw","openshift-kube-apiserver/kube-apiserver-crc"] Jan 22 09:06:26 crc kubenswrapper[4681]: E0122 09:06:26.516077 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3bd3021-b5e7-4c2c-8152-6f0450cea681" containerName="oauth-openshift" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.516090 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3bd3021-b5e7-4c2c-8152-6f0450cea681" containerName="oauth-openshift" Jan 22 09:06:26 crc kubenswrapper[4681]: E0122 09:06:26.516105 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da4558af-4f26-46a4-839c-a5d77f360bfc" containerName="installer" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.516112 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="da4558af-4f26-46a4-839c-a5d77f360bfc" containerName="installer" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.516450 4681 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9db32b89-90cc-4e93-916a-257088ca3c23" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.516481 4681 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9db32b89-90cc-4e93-916a-257088ca3c23" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.516629 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3bd3021-b5e7-4c2c-8152-6f0450cea681" containerName="oauth-openshift" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.516682 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="da4558af-4f26-46a4-839c-a5d77f360bfc" containerName="installer" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 
09:06:26.517451 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.522051 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.522649 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.522976 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.523327 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.524878 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.525126 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.525139 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.525416 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.525635 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.525688 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.525756 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.526047 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.526473 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.538059 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.541828 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.552253 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.554554 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.554536469 podStartE2EDuration="25.554536469s" podCreationTimestamp="2026-01-22 09:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:06:26.552839855 +0000 UTC m=+177.378750370" watchObservedRunningTime="2026-01-22 09:06:26.554536469 +0000 UTC m=+177.380446984" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.575925 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.582513 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.657414 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.657464 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-session\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.657494 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d030a8a5-b0f3-4182-a128-9ccdf25f506e-audit-policies\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.657522 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.657553 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.657682 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.657778 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.657802 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xbkx\" (UniqueName: \"kubernetes.io/projected/d030a8a5-b0f3-4182-a128-9ccdf25f506e-kube-api-access-4xbkx\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.657896 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d030a8a5-b0f3-4182-a128-9ccdf25f506e-audit-dir\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.657983 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.658026 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.658061 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-user-template-login\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.658223 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.658309 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-user-template-error\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.760034 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-user-template-error\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.760146 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.760186 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-session\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.760225 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d030a8a5-b0f3-4182-a128-9ccdf25f506e-audit-policies\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.760295 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.760331 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.760378 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.760453 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.760482 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4xbkx\" (UniqueName: \"kubernetes.io/projected/d030a8a5-b0f3-4182-a128-9ccdf25f506e-kube-api-access-4xbkx\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.760515 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d030a8a5-b0f3-4182-a128-9ccdf25f506e-audit-dir\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.761759 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d030a8a5-b0f3-4182-a128-9ccdf25f506e-audit-policies\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.761796 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.761996 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d030a8a5-b0f3-4182-a128-9ccdf25f506e-audit-dir\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.762181 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.763041 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.763106 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.763150 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-user-template-login\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.763239 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.764487 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.769402 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.769484 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.770431 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.770433 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.771020 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-system-session\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.771031 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.772725 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-user-template-error\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.773994 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d030a8a5-b0f3-4182-a128-9ccdf25f506e-v4-0-config-user-template-login\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.795436 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xbkx\" (UniqueName: \"kubernetes.io/projected/d030a8a5-b0f3-4182-a128-9ccdf25f506e-kube-api-access-4xbkx\") pod \"oauth-openshift-6cc7c68bbf-d2dfw\" (UID: \"d030a8a5-b0f3-4182-a128-9ccdf25f506e\") " pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.843597 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.946732 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw"] Jan 22 09:06:26 crc kubenswrapper[4681]: I0122 09:06:26.989456 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 22 09:06:27 crc kubenswrapper[4681]: I0122 09:06:27.061332 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 22 09:06:27 crc kubenswrapper[4681]: I0122 09:06:27.103404 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 22 09:06:27 crc kubenswrapper[4681]: I0122 09:06:27.306916 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 22 09:06:27 crc kubenswrapper[4681]: I0122 09:06:27.354919 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw"] Jan 22 09:06:27 crc kubenswrapper[4681]: I0122 09:06:27.441065 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 22 09:06:27 crc kubenswrapper[4681]: I0122 09:06:27.462048 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3bd3021-b5e7-4c2c-8152-6f0450cea681" path="/var/lib/kubelet/pods/e3bd3021-b5e7-4c2c-8152-6f0450cea681/volumes" Jan 22 09:06:27 crc kubenswrapper[4681]: I0122 09:06:27.797380 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" 
event={"ID":"d030a8a5-b0f3-4182-a128-9ccdf25f506e","Type":"ContainerStarted","Data":"14bed61f0ba1c954ee47c4f77ff7a173d990e1b9e3c567674c92caec67c45bcc"} Jan 22 09:06:27 crc kubenswrapper[4681]: I0122 09:06:27.797739 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:27 crc kubenswrapper[4681]: I0122 09:06:27.797764 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" event={"ID":"d030a8a5-b0f3-4182-a128-9ccdf25f506e","Type":"ContainerStarted","Data":"ac712f635f811e7fc753afa91ad86d2f9b7f933ebd070b789b3eca1642f605dd"} Jan 22 09:06:27 crc kubenswrapper[4681]: I0122 09:06:27.807739 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 22 09:06:27 crc kubenswrapper[4681]: I0122 09:06:27.823845 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" podStartSLOduration=53.823828902 podStartE2EDuration="53.823828902s" podCreationTimestamp="2026-01-22 09:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:06:27.822238191 +0000 UTC m=+178.648148736" watchObservedRunningTime="2026-01-22 09:06:27.823828902 +0000 UTC m=+178.649739407" Jan 22 09:06:27 crc kubenswrapper[4681]: I0122 09:06:27.879083 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 22 09:06:27 crc kubenswrapper[4681]: I0122 09:06:27.929819 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 22 09:06:27 crc kubenswrapper[4681]: I0122 09:06:27.997870 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 22 09:06:28 crc kubenswrapper[4681]: I0122 09:06:28.050509 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6cc7c68bbf-d2dfw" Jan 22 09:06:28 crc kubenswrapper[4681]: I0122 09:06:28.639414 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 22 09:06:30 crc kubenswrapper[4681]: I0122 09:06:30.112340 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.045350 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm"] Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.045787 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" podUID="88771d4a-dace-45a4-bd87-9f25aaa52f50" containerName="controller-manager" containerID="cri-o://3f888cc05b4d35680900b9af6486d11863bd0459055406849558083f1ac58f9c" gracePeriod=30 Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.142101 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg"] Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.142342 4681 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" podUID="045520f1-ee73-440d-84a0-ebc935904fa2" containerName="route-controller-manager" containerID="cri-o://5f15ca711c9684978a4c19fa786fb286ea14f501f4928685b1c8e253ba891217" gracePeriod=30 Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.510384 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.518921 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.671534 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-client-ca\") pod \"88771d4a-dace-45a4-bd87-9f25aaa52f50\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.671746 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/045520f1-ee73-440d-84a0-ebc935904fa2-serving-cert\") pod \"045520f1-ee73-440d-84a0-ebc935904fa2\" (UID: \"045520f1-ee73-440d-84a0-ebc935904fa2\") " Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.671987 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-proxy-ca-bundles\") pod \"88771d4a-dace-45a4-bd87-9f25aaa52f50\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.672068 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-config\") pod \"88771d4a-dace-45a4-bd87-9f25aaa52f50\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.672168 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88771d4a-dace-45a4-bd87-9f25aaa52f50-serving-cert\") pod \"88771d4a-dace-45a4-bd87-9f25aaa52f50\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.672218 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/045520f1-ee73-440d-84a0-ebc935904fa2-config\") pod \"045520f1-ee73-440d-84a0-ebc935904fa2\" (UID: \"045520f1-ee73-440d-84a0-ebc935904fa2\") " Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.672347 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/045520f1-ee73-440d-84a0-ebc935904fa2-client-ca\") pod \"045520f1-ee73-440d-84a0-ebc935904fa2\" (UID: \"045520f1-ee73-440d-84a0-ebc935904fa2\") " Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.672471 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "88771d4a-dace-45a4-bd87-9f25aaa52f50" (UID: "88771d4a-dace-45a4-bd87-9f25aaa52f50"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.672502 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-config" (OuterVolumeSpecName: "config") pod "88771d4a-dace-45a4-bd87-9f25aaa52f50" (UID: "88771d4a-dace-45a4-bd87-9f25aaa52f50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.672576 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-client-ca" (OuterVolumeSpecName: "client-ca") pod "88771d4a-dace-45a4-bd87-9f25aaa52f50" (UID: "88771d4a-dace-45a4-bd87-9f25aaa52f50"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.673069 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/045520f1-ee73-440d-84a0-ebc935904fa2-client-ca" (OuterVolumeSpecName: "client-ca") pod "045520f1-ee73-440d-84a0-ebc935904fa2" (UID: "045520f1-ee73-440d-84a0-ebc935904fa2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.673428 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/045520f1-ee73-440d-84a0-ebc935904fa2-config" (OuterVolumeSpecName: "config") pod "045520f1-ee73-440d-84a0-ebc935904fa2" (UID: "045520f1-ee73-440d-84a0-ebc935904fa2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.673788 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42sfl\" (UniqueName: \"kubernetes.io/projected/045520f1-ee73-440d-84a0-ebc935904fa2-kube-api-access-42sfl\") pod \"045520f1-ee73-440d-84a0-ebc935904fa2\" (UID: \"045520f1-ee73-440d-84a0-ebc935904fa2\") " Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.673864 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72bgr\" (UniqueName: \"kubernetes.io/projected/88771d4a-dace-45a4-bd87-9f25aaa52f50-kube-api-access-72bgr\") pod \"88771d4a-dace-45a4-bd87-9f25aaa52f50\" (UID: \"88771d4a-dace-45a4-bd87-9f25aaa52f50\") " Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.674636 4681 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/045520f1-ee73-440d-84a0-ebc935904fa2-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.674684 4681 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.674703 4681 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.674725 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88771d4a-dace-45a4-bd87-9f25aaa52f50-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:33 crc 
kubenswrapper[4681]: I0122 09:06:33.674742 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/045520f1-ee73-440d-84a0-ebc935904fa2-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.678385 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/045520f1-ee73-440d-84a0-ebc935904fa2-kube-api-access-42sfl" (OuterVolumeSpecName: "kube-api-access-42sfl") pod "045520f1-ee73-440d-84a0-ebc935904fa2" (UID: "045520f1-ee73-440d-84a0-ebc935904fa2"). InnerVolumeSpecName "kube-api-access-42sfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.678708 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88771d4a-dace-45a4-bd87-9f25aaa52f50-kube-api-access-72bgr" (OuterVolumeSpecName: "kube-api-access-72bgr") pod "88771d4a-dace-45a4-bd87-9f25aaa52f50" (UID: "88771d4a-dace-45a4-bd87-9f25aaa52f50"). InnerVolumeSpecName "kube-api-access-72bgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.679148 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/045520f1-ee73-440d-84a0-ebc935904fa2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "045520f1-ee73-440d-84a0-ebc935904fa2" (UID: "045520f1-ee73-440d-84a0-ebc935904fa2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.679973 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88771d4a-dace-45a4-bd87-9f25aaa52f50-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "88771d4a-dace-45a4-bd87-9f25aaa52f50" (UID: "88771d4a-dace-45a4-bd87-9f25aaa52f50"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.775967 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/045520f1-ee73-440d-84a0-ebc935904fa2-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.776298 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88771d4a-dace-45a4-bd87-9f25aaa52f50-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.776315 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42sfl\" (UniqueName: \"kubernetes.io/projected/045520f1-ee73-440d-84a0-ebc935904fa2-kube-api-access-42sfl\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.776333 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72bgr\" (UniqueName: \"kubernetes.io/projected/88771d4a-dace-45a4-bd87-9f25aaa52f50-kube-api-access-72bgr\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.837240 4681 generic.go:334] "Generic (PLEG): container finished" podID="88771d4a-dace-45a4-bd87-9f25aaa52f50" containerID="3f888cc05b4d35680900b9af6486d11863bd0459055406849558083f1ac58f9c" exitCode=0 Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.837314 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" event={"ID":"88771d4a-dace-45a4-bd87-9f25aaa52f50","Type":"ContainerDied","Data":"3f888cc05b4d35680900b9af6486d11863bd0459055406849558083f1ac58f9c"} Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.837360 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.837388 4681 scope.go:117] "RemoveContainer" containerID="3f888cc05b4d35680900b9af6486d11863bd0459055406849558083f1ac58f9c" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.837374 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm" event={"ID":"88771d4a-dace-45a4-bd87-9f25aaa52f50","Type":"ContainerDied","Data":"3261ea3c67323766572033641e11e9d0a1a130754a03fea23a58d53b744ce697"} Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.842655 4681 generic.go:334] "Generic (PLEG): container finished" podID="045520f1-ee73-440d-84a0-ebc935904fa2" containerID="5f15ca711c9684978a4c19fa786fb286ea14f501f4928685b1c8e253ba891217" exitCode=0 Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.842702 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" event={"ID":"045520f1-ee73-440d-84a0-ebc935904fa2","Type":"ContainerDied","Data":"5f15ca711c9684978a4c19fa786fb286ea14f501f4928685b1c8e253ba891217"} Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.842729 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.842749 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg" event={"ID":"045520f1-ee73-440d-84a0-ebc935904fa2","Type":"ContainerDied","Data":"b85d5ae4c3ab46ccc79e5ecdb279694cad49b61e94fa94dba10bb842e66ff771"} Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.861196 4681 scope.go:117] "RemoveContainer" containerID="3f888cc05b4d35680900b9af6486d11863bd0459055406849558083f1ac58f9c" Jan 22 09:06:33 crc kubenswrapper[4681]: E0122 09:06:33.861991 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f888cc05b4d35680900b9af6486d11863bd0459055406849558083f1ac58f9c\": container with ID starting with 3f888cc05b4d35680900b9af6486d11863bd0459055406849558083f1ac58f9c not found: ID does not exist" containerID="3f888cc05b4d35680900b9af6486d11863bd0459055406849558083f1ac58f9c" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.862060 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f888cc05b4d35680900b9af6486d11863bd0459055406849558083f1ac58f9c"} err="failed to get container status \"3f888cc05b4d35680900b9af6486d11863bd0459055406849558083f1ac58f9c\": rpc error: code = NotFound desc = could not find container \"3f888cc05b4d35680900b9af6486d11863bd0459055406849558083f1ac58f9c\": container with ID starting with 3f888cc05b4d35680900b9af6486d11863bd0459055406849558083f1ac58f9c not found: ID does not exist" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.862105 4681 scope.go:117] "RemoveContainer" containerID="5f15ca711c9684978a4c19fa786fb286ea14f501f4928685b1c8e253ba891217" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.874489 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm"] Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.879226 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7664b4d6d8-f5xfm"] Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.892459 4681 scope.go:117] "RemoveContainer" containerID="5f15ca711c9684978a4c19fa786fb286ea14f501f4928685b1c8e253ba891217" Jan 22 09:06:33 crc kubenswrapper[4681]: E0122 09:06:33.892973 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f15ca711c9684978a4c19fa786fb286ea14f501f4928685b1c8e253ba891217\": container with ID starting with 5f15ca711c9684978a4c19fa786fb286ea14f501f4928685b1c8e253ba891217 not found: ID does not exist" containerID="5f15ca711c9684978a4c19fa786fb286ea14f501f4928685b1c8e253ba891217" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.893020 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f15ca711c9684978a4c19fa786fb286ea14f501f4928685b1c8e253ba891217"} err="failed to get container status \"5f15ca711c9684978a4c19fa786fb286ea14f501f4928685b1c8e253ba891217\": rpc error: code = NotFound desc = could not find container \"5f15ca711c9684978a4c19fa786fb286ea14f501f4928685b1c8e253ba891217\": container with ID starting with 5f15ca711c9684978a4c19fa786fb286ea14f501f4928685b1c8e253ba891217 not found: ID does not exist" Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.893826 
4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg"] Jan 22 09:06:33 crc kubenswrapper[4681]: I0122 09:06:33.897589 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b8b4b6776-gjrdg"] Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.010386 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64996fbdb-zh4xs"] Jan 22 09:06:35 crc kubenswrapper[4681]: E0122 09:06:35.010858 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="045520f1-ee73-440d-84a0-ebc935904fa2" containerName="route-controller-manager" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.010877 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="045520f1-ee73-440d-84a0-ebc935904fa2" containerName="route-controller-manager" Jan 22 09:06:35 crc kubenswrapper[4681]: E0122 09:06:35.010892 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88771d4a-dace-45a4-bd87-9f25aaa52f50" containerName="controller-manager" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.010903 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="88771d4a-dace-45a4-bd87-9f25aaa52f50" containerName="controller-manager" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.014894 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="88771d4a-dace-45a4-bd87-9f25aaa52f50" containerName="controller-manager" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.014930 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="045520f1-ee73-440d-84a0-ebc935904fa2" containerName="route-controller-manager" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.015405 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.016482 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx"] Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.017511 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.019518 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.019705 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.026657 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.026753 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.026869 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.027387 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.033704 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.033704 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.033871 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.034846 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.035216 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.036503 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.036549 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx"] Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.040707 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.054213 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64996fbdb-zh4xs"] Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.091576 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc8edd67-e844-4fc8-af89-92b332bceea1-config\") pod \"route-controller-manager-96df84589-8t4vx\" (UID: \"cc8edd67-e844-4fc8-af89-92b332bceea1\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.091655 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-config\") pod \"controller-manager-64996fbdb-zh4xs\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.091708 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc8edd67-e844-4fc8-af89-92b332bceea1-client-ca\") pod \"route-controller-manager-96df84589-8t4vx\" (UID: \"cc8edd67-e844-4fc8-af89-92b332bceea1\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.091810 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-client-ca\") pod \"controller-manager-64996fbdb-zh4xs\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.091870 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-proxy-ca-bundles\") pod \"controller-manager-64996fbdb-zh4xs\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.091917 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k9jn\" (UniqueName: \"kubernetes.io/projected/cc8edd67-e844-4fc8-af89-92b332bceea1-kube-api-access-8k9jn\") pod \"route-controller-manager-96df84589-8t4vx\" (UID: \"cc8edd67-e844-4fc8-af89-92b332bceea1\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.091955 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmkgk\" (UniqueName: \"kubernetes.io/projected/1e547b1a-19db-4893-823c-945fcbc64975-kube-api-access-pmkgk\") pod \"controller-manager-64996fbdb-zh4xs\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.092024 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e547b1a-19db-4893-823c-945fcbc64975-serving-cert\") pod \"controller-manager-64996fbdb-zh4xs\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.092055 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc8edd67-e844-4fc8-af89-92b332bceea1-serving-cert\") pod \"route-controller-manager-96df84589-8t4vx\" (UID: \"cc8edd67-e844-4fc8-af89-92b332bceea1\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.193621 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cc8edd67-e844-4fc8-af89-92b332bceea1-config\") pod \"route-controller-manager-96df84589-8t4vx\" (UID: \"cc8edd67-e844-4fc8-af89-92b332bceea1\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.193699 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-config\") pod \"controller-manager-64996fbdb-zh4xs\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.193787 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc8edd67-e844-4fc8-af89-92b332bceea1-client-ca\") pod \"route-controller-manager-96df84589-8t4vx\" (UID: \"cc8edd67-e844-4fc8-af89-92b332bceea1\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.193848 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-client-ca\") pod \"controller-manager-64996fbdb-zh4xs\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.193904 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-proxy-ca-bundles\") pod \"controller-manager-64996fbdb-zh4xs\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.193951 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k9jn\" (UniqueName: \"kubernetes.io/projected/cc8edd67-e844-4fc8-af89-92b332bceea1-kube-api-access-8k9jn\") pod \"route-controller-manager-96df84589-8t4vx\" (UID: \"cc8edd67-e844-4fc8-af89-92b332bceea1\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.193992 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmkgk\" (UniqueName: \"kubernetes.io/projected/1e547b1a-19db-4893-823c-945fcbc64975-kube-api-access-pmkgk\") pod \"controller-manager-64996fbdb-zh4xs\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.194035 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e547b1a-19db-4893-823c-945fcbc64975-serving-cert\") pod \"controller-manager-64996fbdb-zh4xs\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.194069 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc8edd67-e844-4fc8-af89-92b332bceea1-serving-cert\") pod \"route-controller-manager-96df84589-8t4vx\" (UID: 
\"cc8edd67-e844-4fc8-af89-92b332bceea1\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.195955 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-proxy-ca-bundles\") pod \"controller-manager-64996fbdb-zh4xs\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.195961 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc8edd67-e844-4fc8-af89-92b332bceea1-config\") pod \"route-controller-manager-96df84589-8t4vx\" (UID: \"cc8edd67-e844-4fc8-af89-92b332bceea1\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.196324 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-config\") pod \"controller-manager-64996fbdb-zh4xs\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.196842 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc8edd67-e844-4fc8-af89-92b332bceea1-client-ca\") pod \"route-controller-manager-96df84589-8t4vx\" (UID: \"cc8edd67-e844-4fc8-af89-92b332bceea1\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.197403 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-client-ca\") pod \"controller-manager-64996fbdb-zh4xs\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.200694 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc8edd67-e844-4fc8-af89-92b332bceea1-serving-cert\") pod \"route-controller-manager-96df84589-8t4vx\" (UID: \"cc8edd67-e844-4fc8-af89-92b332bceea1\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.205207 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e547b1a-19db-4893-823c-945fcbc64975-serving-cert\") pod \"controller-manager-64996fbdb-zh4xs\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.219588 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmkgk\" (UniqueName: \"kubernetes.io/projected/1e547b1a-19db-4893-823c-945fcbc64975-kube-api-access-pmkgk\") pod \"controller-manager-64996fbdb-zh4xs\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.225380 4681 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-8k9jn\" (UniqueName: \"kubernetes.io/projected/cc8edd67-e844-4fc8-af89-92b332bceea1-kube-api-access-8k9jn\") pod \"route-controller-manager-96df84589-8t4vx\" (UID: \"cc8edd67-e844-4fc8-af89-92b332bceea1\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.347540 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.361177 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.435695 4681 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.436489 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://7281228eb38806208dedd0dd3b44252085b88b9435ef7af19a225f474bc59904" gracePeriod=5 Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.462417 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="045520f1-ee73-440d-84a0-ebc935904fa2" path="/var/lib/kubelet/pods/045520f1-ee73-440d-84a0-ebc935904fa2/volumes" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.463250 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88771d4a-dace-45a4-bd87-9f25aaa52f50" path="/var/lib/kubelet/pods/88771d4a-dace-45a4-bd87-9f25aaa52f50/volumes" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.618782 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64996fbdb-zh4xs"] Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.861222 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" event={"ID":"1e547b1a-19db-4893-823c-945fcbc64975","Type":"ContainerStarted","Data":"698333448efe4d6d3b3708d606400cd097fe96f9e7507aeb5f1d1a1e42ac3bc7"} Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.861570 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" event={"ID":"1e547b1a-19db-4893-823c-945fcbc64975","Type":"ContainerStarted","Data":"ad2039191bdda81e076f34bc6f6cebe045f56f1f4e5154b259c8e1337df41b71"} Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.863051 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.911223 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.916676 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" podStartSLOduration=2.916660787 podStartE2EDuration="2.916660787s" podCreationTimestamp="2026-01-22 09:06:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 
09:06:35.915476876 +0000 UTC m=+186.741387391" watchObservedRunningTime="2026-01-22 09:06:35.916660787 +0000 UTC m=+186.742571292" Jan 22 09:06:35 crc kubenswrapper[4681]: I0122 09:06:35.943938 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx"] Jan 22 09:06:36 crc kubenswrapper[4681]: I0122 09:06:36.868923 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" event={"ID":"cc8edd67-e844-4fc8-af89-92b332bceea1","Type":"ContainerStarted","Data":"e7732ec3146aa7a86dba10405929c6cfc9dfa31c239ddb2474908d6cdb9a4fda"} Jan 22 09:06:36 crc kubenswrapper[4681]: I0122 09:06:36.869250 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:36 crc kubenswrapper[4681]: I0122 09:06:36.869278 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" event={"ID":"cc8edd67-e844-4fc8-af89-92b332bceea1","Type":"ContainerStarted","Data":"7ff8c79c9468ada3fba40aa46d504cf0642c3b07f15826a7aed1c218275fafd0"} Jan 22 09:06:36 crc kubenswrapper[4681]: I0122 09:06:36.878089 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:36 crc kubenswrapper[4681]: I0122 09:06:36.908344 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" podStartSLOduration=3.908320065 podStartE2EDuration="3.908320065s" podCreationTimestamp="2026-01-22 09:06:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:06:36.889315984 +0000 UTC m=+187.715226489" watchObservedRunningTime="2026-01-22 09:06:36.908320065 +0000 UTC m=+187.734230580" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.285905 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h8d59"] Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.287311 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h8d59" podUID="0e85041c-4d16-4a52-ae78-e3dc2d1e81f9" containerName="registry-server" containerID="cri-o://12e348d2516c3e7a309daf85738a13d0530084aa3080257d2ecf7afc44c8b118" gracePeriod=30 Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.306219 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sdgpn"] Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.306718 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sdgpn" podUID="9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b" containerName="registry-server" containerID="cri-o://3c5465d5080b6a5c95965d29f8d5821cd64dcddf225d1fff97bf768940c1dc13" gracePeriod=30 Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.317315 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qhf7x"] Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.317538 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" 
podUID="6d75b145-9547-49b4-9aea-652ea33cb371" containerName="marketplace-operator" containerID="cri-o://969ce50d2f970fec1c3be7a46c6510eb0fe8554ad7fce758265e3693c1bc7dc9" gracePeriod=30 Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.331573 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-44cjz"] Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.331970 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-44cjz" podUID="49ca3012-b9cb-46cd-b37c-4a74472c3fef" containerName="registry-server" containerID="cri-o://36340238d503d3c6649d5cb6b1c5b227c3f92280c661d33d4e928198107e86c0" gracePeriod=30 Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.335604 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7pz5k"] Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.335844 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7pz5k" podUID="cafe2ee6-7f62-4d78-8e7e-de58c8506696" containerName="registry-server" containerID="cri-o://26347e5fe28e5a546dd4dcf5960db288addc2253d6e3f89b83e345f634e16c6a" gracePeriod=30 Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.339015 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zf5vt"] Jan 22 09:06:40 crc kubenswrapper[4681]: E0122 09:06:40.339411 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.339429 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.339536 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.339926 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zf5vt" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.352629 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zf5vt"] Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.470954 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/45a25363-c0b4-4fdb-a773-fc99c6653bbb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zf5vt\" (UID: \"45a25363-c0b4-4fdb-a773-fc99c6653bbb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zf5vt" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.471136 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gcxc\" (UniqueName: \"kubernetes.io/projected/45a25363-c0b4-4fdb-a773-fc99c6653bbb-kube-api-access-7gcxc\") pod \"marketplace-operator-79b997595-zf5vt\" (UID: \"45a25363-c0b4-4fdb-a773-fc99c6653bbb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zf5vt" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.471172 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45a25363-c0b4-4fdb-a773-fc99c6653bbb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zf5vt\" (UID: \"45a25363-c0b4-4fdb-a773-fc99c6653bbb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zf5vt" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.572654 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/45a25363-c0b4-4fdb-a773-fc99c6653bbb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zf5vt\" (UID: \"45a25363-c0b4-4fdb-a773-fc99c6653bbb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zf5vt" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.572737 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gcxc\" (UniqueName: \"kubernetes.io/projected/45a25363-c0b4-4fdb-a773-fc99c6653bbb-kube-api-access-7gcxc\") pod \"marketplace-operator-79b997595-zf5vt\" (UID: \"45a25363-c0b4-4fdb-a773-fc99c6653bbb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zf5vt" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.572795 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45a25363-c0b4-4fdb-a773-fc99c6653bbb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zf5vt\" (UID: \"45a25363-c0b4-4fdb-a773-fc99c6653bbb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zf5vt" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.574038 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45a25363-c0b4-4fdb-a773-fc99c6653bbb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zf5vt\" (UID: \"45a25363-c0b4-4fdb-a773-fc99c6653bbb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zf5vt" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.586140 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/45a25363-c0b4-4fdb-a773-fc99c6653bbb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zf5vt\" (UID: \"45a25363-c0b4-4fdb-a773-fc99c6653bbb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zf5vt" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.594118 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gcxc\" (UniqueName: \"kubernetes.io/projected/45a25363-c0b4-4fdb-a773-fc99c6653bbb-kube-api-access-7gcxc\") pod \"marketplace-operator-79b997595-zf5vt\" (UID: \"45a25363-c0b4-4fdb-a773-fc99c6653bbb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zf5vt" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.681174 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zf5vt" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.846382 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.846759 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.872957 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h8d59" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.905275 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.905325 4681 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="7281228eb38806208dedd0dd3b44252085b88b9435ef7af19a225f474bc59904" exitCode=137 Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.905440 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.905677 4681 scope.go:117] "RemoveContainer" containerID="7281228eb38806208dedd0dd3b44252085b88b9435ef7af19a225f474bc59904" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.915439 4681 generic.go:334] "Generic (PLEG): container finished" podID="9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b" containerID="3c5465d5080b6a5c95965d29f8d5821cd64dcddf225d1fff97bf768940c1dc13" exitCode=0 Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.915543 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdgpn" event={"ID":"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b","Type":"ContainerDied","Data":"3c5465d5080b6a5c95965d29f8d5821cd64dcddf225d1fff97bf768940c1dc13"} Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.922989 4681 generic.go:334] "Generic (PLEG): container finished" podID="0e85041c-4d16-4a52-ae78-e3dc2d1e81f9" containerID="12e348d2516c3e7a309daf85738a13d0530084aa3080257d2ecf7afc44c8b118" exitCode=0 Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.923046 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h8d59" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.923124 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8d59" event={"ID":"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9","Type":"ContainerDied","Data":"12e348d2516c3e7a309daf85738a13d0530084aa3080257d2ecf7afc44c8b118"} Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.923154 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8d59" event={"ID":"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9","Type":"ContainerDied","Data":"24b1f562282018edd88b319cbe1521587f1c76759f1108ec5b279208494ff181"} Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.931099 4681 scope.go:117] "RemoveContainer" containerID="7281228eb38806208dedd0dd3b44252085b88b9435ef7af19a225f474bc59904" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.931175 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" Jan 22 09:06:40 crc kubenswrapper[4681]: E0122 09:06:40.931670 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7281228eb38806208dedd0dd3b44252085b88b9435ef7af19a225f474bc59904\": container with ID starting with 7281228eb38806208dedd0dd3b44252085b88b9435ef7af19a225f474bc59904 not found: ID does not exist" containerID="7281228eb38806208dedd0dd3b44252085b88b9435ef7af19a225f474bc59904" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.931714 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7281228eb38806208dedd0dd3b44252085b88b9435ef7af19a225f474bc59904"} err="failed to get container status \"7281228eb38806208dedd0dd3b44252085b88b9435ef7af19a225f474bc59904\": rpc error: code = NotFound desc = could not find container \"7281228eb38806208dedd0dd3b44252085b88b9435ef7af19a225f474bc59904\": container with ID starting with 7281228eb38806208dedd0dd3b44252085b88b9435ef7af19a225f474bc59904 not found: ID does not exist" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.931739 4681 scope.go:117] "RemoveContainer" containerID="12e348d2516c3e7a309daf85738a13d0530084aa3080257d2ecf7afc44c8b118" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.932512 4681 generic.go:334] "Generic (PLEG): container finished" podID="cafe2ee6-7f62-4d78-8e7e-de58c8506696" containerID="26347e5fe28e5a546dd4dcf5960db288addc2253d6e3f89b83e345f634e16c6a" exitCode=0 Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.932594 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pz5k" event={"ID":"cafe2ee6-7f62-4d78-8e7e-de58c8506696","Type":"ContainerDied","Data":"26347e5fe28e5a546dd4dcf5960db288addc2253d6e3f89b83e345f634e16c6a"} Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.937582 4681 generic.go:334] "Generic (PLEG): container finished" podID="6d75b145-9547-49b4-9aea-652ea33cb371" containerID="969ce50d2f970fec1c3be7a46c6510eb0fe8554ad7fce758265e3693c1bc7dc9" exitCode=0 Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.937646 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.937764 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qhf7x" event={"ID":"6d75b145-9547-49b4-9aea-652ea33cb371","Type":"ContainerDied","Data":"969ce50d2f970fec1c3be7a46c6510eb0fe8554ad7fce758265e3693c1bc7dc9"} Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.950071 4681 generic.go:334] "Generic (PLEG): container finished" podID="49ca3012-b9cb-46cd-b37c-4a74472c3fef" containerID="36340238d503d3c6649d5cb6b1c5b227c3f92280c661d33d4e928198107e86c0" exitCode=0 Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.950116 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44cjz" event={"ID":"49ca3012-b9cb-46cd-b37c-4a74472c3fef","Type":"ContainerDied","Data":"36340238d503d3c6649d5cb6b1c5b227c3f92280c661d33d4e928198107e86c0"} Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.954068 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7pz5k" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.960349 4681 scope.go:117] "RemoveContainer" containerID="b85bdabbac0af878eb15edbab5c9523761429af834748ce6a0409c831da5ea75" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.964920 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44cjz" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.965554 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sdgpn" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.977122 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k9q4\" (UniqueName: \"kubernetes.io/projected/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-kube-api-access-9k9q4\") pod \"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9\" (UID: \"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9\") " Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.977179 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.977207 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.977277 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.977342 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-catalog-content\") pod \"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9\" (UID: \"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9\") " Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 
09:06:40.977415 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-utilities\") pod \"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9\" (UID: \"0e85041c-4d16-4a52-ae78-e3dc2d1e81f9\") " Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.977438 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.977478 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.977685 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.978431 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.978490 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.978466 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.978573 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-utilities" (OuterVolumeSpecName: "utilities") pod "0e85041c-4d16-4a52-ae78-e3dc2d1e81f9" (UID: "0e85041c-4d16-4a52-ae78-e3dc2d1e81f9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.979903 4681 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.979932 4681 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.979941 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.979975 4681 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.979984 4681 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.988326 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-kube-api-access-9k9q4" (OuterVolumeSpecName: "kube-api-access-9k9q4") pod "0e85041c-4d16-4a52-ae78-e3dc2d1e81f9" (UID: "0e85041c-4d16-4a52-ae78-e3dc2d1e81f9"). InnerVolumeSpecName "kube-api-access-9k9q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:06:40 crc kubenswrapper[4681]: I0122 09:06:40.988338 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.006551 4681 scope.go:117] "RemoveContainer" containerID="60be4201fccf16e6917af06144a5880c31de151e131366661a6b1d35c4c29b18" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.033991 4681 scope.go:117] "RemoveContainer" containerID="12e348d2516c3e7a309daf85738a13d0530084aa3080257d2ecf7afc44c8b118" Jan 22 09:06:41 crc kubenswrapper[4681]: E0122 09:06:41.034423 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12e348d2516c3e7a309daf85738a13d0530084aa3080257d2ecf7afc44c8b118\": container with ID starting with 12e348d2516c3e7a309daf85738a13d0530084aa3080257d2ecf7afc44c8b118 not found: ID does not exist" containerID="12e348d2516c3e7a309daf85738a13d0530084aa3080257d2ecf7afc44c8b118" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.034454 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e348d2516c3e7a309daf85738a13d0530084aa3080257d2ecf7afc44c8b118"} err="failed to get container status \"12e348d2516c3e7a309daf85738a13d0530084aa3080257d2ecf7afc44c8b118\": rpc error: code = NotFound desc = could not find container \"12e348d2516c3e7a309daf85738a13d0530084aa3080257d2ecf7afc44c8b118\": container with ID starting with 12e348d2516c3e7a309daf85738a13d0530084aa3080257d2ecf7afc44c8b118 not found: ID does not exist" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.034484 4681 scope.go:117] "RemoveContainer" containerID="b85bdabbac0af878eb15edbab5c9523761429af834748ce6a0409c831da5ea75" Jan 22 09:06:41 crc kubenswrapper[4681]: E0122 09:06:41.035138 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b85bdabbac0af878eb15edbab5c9523761429af834748ce6a0409c831da5ea75\": container with ID starting with b85bdabbac0af878eb15edbab5c9523761429af834748ce6a0409c831da5ea75 not found: ID does not exist" containerID="b85bdabbac0af878eb15edbab5c9523761429af834748ce6a0409c831da5ea75" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.035160 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b85bdabbac0af878eb15edbab5c9523761429af834748ce6a0409c831da5ea75"} err="failed to get container status \"b85bdabbac0af878eb15edbab5c9523761429af834748ce6a0409c831da5ea75\": rpc error: code = NotFound desc = could not find container \"b85bdabbac0af878eb15edbab5c9523761429af834748ce6a0409c831da5ea75\": container with ID starting with b85bdabbac0af878eb15edbab5c9523761429af834748ce6a0409c831da5ea75 not found: ID does not exist" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.035175 4681 scope.go:117] "RemoveContainer" containerID="60be4201fccf16e6917af06144a5880c31de151e131366661a6b1d35c4c29b18" Jan 22 09:06:41 crc kubenswrapper[4681]: E0122 09:06:41.035403 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60be4201fccf16e6917af06144a5880c31de151e131366661a6b1d35c4c29b18\": container with ID starting with 60be4201fccf16e6917af06144a5880c31de151e131366661a6b1d35c4c29b18 not found: ID does not exist" containerID="60be4201fccf16e6917af06144a5880c31de151e131366661a6b1d35c4c29b18" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.035425 4681 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"60be4201fccf16e6917af06144a5880c31de151e131366661a6b1d35c4c29b18"} err="failed to get container status \"60be4201fccf16e6917af06144a5880c31de151e131366661a6b1d35c4c29b18\": rpc error: code = NotFound desc = could not find container \"60be4201fccf16e6917af06144a5880c31de151e131366661a6b1d35c4c29b18\": container with ID starting with 60be4201fccf16e6917af06144a5880c31de151e131366661a6b1d35c4c29b18 not found: ID does not exist" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.035439 4681 scope.go:117] "RemoveContainer" containerID="969ce50d2f970fec1c3be7a46c6510eb0fe8554ad7fce758265e3693c1bc7dc9" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.042084 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e85041c-4d16-4a52-ae78-e3dc2d1e81f9" (UID: "0e85041c-4d16-4a52-ae78-e3dc2d1e81f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.080844 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d75b145-9547-49b4-9aea-652ea33cb371-marketplace-operator-metrics\") pod \"6d75b145-9547-49b4-9aea-652ea33cb371\" (UID: \"6d75b145-9547-49b4-9aea-652ea33cb371\") " Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.080902 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds2zh\" (UniqueName: \"kubernetes.io/projected/49ca3012-b9cb-46cd-b37c-4a74472c3fef-kube-api-access-ds2zh\") pod \"49ca3012-b9cb-46cd-b37c-4a74472c3fef\" (UID: \"49ca3012-b9cb-46cd-b37c-4a74472c3fef\") " Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.080938 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-utilities\") pod \"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b\" (UID: \"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b\") " Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.080975 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d75b145-9547-49b4-9aea-652ea33cb371-marketplace-trusted-ca\") pod \"6d75b145-9547-49b4-9aea-652ea33cb371\" (UID: \"6d75b145-9547-49b4-9aea-652ea33cb371\") " Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.080998 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99lfs\" (UniqueName: \"kubernetes.io/projected/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-kube-api-access-99lfs\") pod \"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b\" (UID: \"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b\") " Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.081018 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whjzk\" (UniqueName: \"kubernetes.io/projected/6d75b145-9547-49b4-9aea-652ea33cb371-kube-api-access-whjzk\") pod \"6d75b145-9547-49b4-9aea-652ea33cb371\" (UID: \"6d75b145-9547-49b4-9aea-652ea33cb371\") " Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.081038 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-catalog-content\") pod 
\"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b\" (UID: \"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b\") " Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.081064 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cafe2ee6-7f62-4d78-8e7e-de58c8506696-catalog-content\") pod \"cafe2ee6-7f62-4d78-8e7e-de58c8506696\" (UID: \"cafe2ee6-7f62-4d78-8e7e-de58c8506696\") " Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.081098 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cafe2ee6-7f62-4d78-8e7e-de58c8506696-utilities\") pod \"cafe2ee6-7f62-4d78-8e7e-de58c8506696\" (UID: \"cafe2ee6-7f62-4d78-8e7e-de58c8506696\") " Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.081125 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49ca3012-b9cb-46cd-b37c-4a74472c3fef-catalog-content\") pod \"49ca3012-b9cb-46cd-b37c-4a74472c3fef\" (UID: \"49ca3012-b9cb-46cd-b37c-4a74472c3fef\") " Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.081150 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49ca3012-b9cb-46cd-b37c-4a74472c3fef-utilities\") pod \"49ca3012-b9cb-46cd-b37c-4a74472c3fef\" (UID: \"49ca3012-b9cb-46cd-b37c-4a74472c3fef\") " Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.081171 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4dj8\" (UniqueName: \"kubernetes.io/projected/cafe2ee6-7f62-4d78-8e7e-de58c8506696-kube-api-access-g4dj8\") pod \"cafe2ee6-7f62-4d78-8e7e-de58c8506696\" (UID: \"cafe2ee6-7f62-4d78-8e7e-de58c8506696\") " Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.081376 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k9q4\" (UniqueName: \"kubernetes.io/projected/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-kube-api-access-9k9q4\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.081387 4681 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.081396 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.082657 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-utilities" (OuterVolumeSpecName: "utilities") pod "9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b" (UID: "9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.082856 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d75b145-9547-49b4-9aea-652ea33cb371-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6d75b145-9547-49b4-9aea-652ea33cb371" (UID: "6d75b145-9547-49b4-9aea-652ea33cb371"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.082978 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cafe2ee6-7f62-4d78-8e7e-de58c8506696-utilities" (OuterVolumeSpecName: "utilities") pod "cafe2ee6-7f62-4d78-8e7e-de58c8506696" (UID: "cafe2ee6-7f62-4d78-8e7e-de58c8506696"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.084015 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49ca3012-b9cb-46cd-b37c-4a74472c3fef-utilities" (OuterVolumeSpecName: "utilities") pod "49ca3012-b9cb-46cd-b37c-4a74472c3fef" (UID: "49ca3012-b9cb-46cd-b37c-4a74472c3fef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.084630 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d75b145-9547-49b4-9aea-652ea33cb371-kube-api-access-whjzk" (OuterVolumeSpecName: "kube-api-access-whjzk") pod "6d75b145-9547-49b4-9aea-652ea33cb371" (UID: "6d75b145-9547-49b4-9aea-652ea33cb371"). InnerVolumeSpecName "kube-api-access-whjzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.085065 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ca3012-b9cb-46cd-b37c-4a74472c3fef-kube-api-access-ds2zh" (OuterVolumeSpecName: "kube-api-access-ds2zh") pod "49ca3012-b9cb-46cd-b37c-4a74472c3fef" (UID: "49ca3012-b9cb-46cd-b37c-4a74472c3fef"). InnerVolumeSpecName "kube-api-access-ds2zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.085248 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d75b145-9547-49b4-9aea-652ea33cb371-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6d75b145-9547-49b4-9aea-652ea33cb371" (UID: "6d75b145-9547-49b4-9aea-652ea33cb371"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.088282 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cafe2ee6-7f62-4d78-8e7e-de58c8506696-kube-api-access-g4dj8" (OuterVolumeSpecName: "kube-api-access-g4dj8") pod "cafe2ee6-7f62-4d78-8e7e-de58c8506696" (UID: "cafe2ee6-7f62-4d78-8e7e-de58c8506696"). InnerVolumeSpecName "kube-api-access-g4dj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.090207 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-kube-api-access-99lfs" (OuterVolumeSpecName: "kube-api-access-99lfs") pod "9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b" (UID: "9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b"). InnerVolumeSpecName "kube-api-access-99lfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.108005 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49ca3012-b9cb-46cd-b37c-4a74472c3fef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49ca3012-b9cb-46cd-b37c-4a74472c3fef" (UID: "49ca3012-b9cb-46cd-b37c-4a74472c3fef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.142592 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b" (UID: "9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.182510 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds2zh\" (UniqueName: \"kubernetes.io/projected/49ca3012-b9cb-46cd-b37c-4a74472c3fef-kube-api-access-ds2zh\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.182564 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.182588 4681 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d75b145-9547-49b4-9aea-652ea33cb371-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.182607 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99lfs\" (UniqueName: \"kubernetes.io/projected/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-kube-api-access-99lfs\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.182627 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whjzk\" (UniqueName: \"kubernetes.io/projected/6d75b145-9547-49b4-9aea-652ea33cb371-kube-api-access-whjzk\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.182646 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.182665 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cafe2ee6-7f62-4d78-8e7e-de58c8506696-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.182684 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49ca3012-b9cb-46cd-b37c-4a74472c3fef-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.182701 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49ca3012-b9cb-46cd-b37c-4a74472c3fef-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.182719 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4dj8\" (UniqueName: 
\"kubernetes.io/projected/cafe2ee6-7f62-4d78-8e7e-de58c8506696-kube-api-access-g4dj8\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.182737 4681 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d75b145-9547-49b4-9aea-652ea33cb371-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.194493 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zf5vt"] Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.222718 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cafe2ee6-7f62-4d78-8e7e-de58c8506696-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cafe2ee6-7f62-4d78-8e7e-de58c8506696" (UID: "cafe2ee6-7f62-4d78-8e7e-de58c8506696"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.259159 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h8d59"] Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.262016 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h8d59"] Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.284083 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cafe2ee6-7f62-4d78-8e7e-de58c8506696-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.288039 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qhf7x"] Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.291476 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qhf7x"] Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.460991 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e85041c-4d16-4a52-ae78-e3dc2d1e81f9" path="/var/lib/kubelet/pods/0e85041c-4d16-4a52-ae78-e3dc2d1e81f9/volumes" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.462326 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d75b145-9547-49b4-9aea-652ea33cb371" path="/var/lib/kubelet/pods/6d75b145-9547-49b4-9aea-652ea33cb371/volumes" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.462774 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.463032 4681 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.506870 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.506948 4681 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="ef6936ae-eea7-4a41-a03b-cbfa5196a445" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.520456 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.520511 4681 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="ef6936ae-eea7-4a41-a03b-cbfa5196a445" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.958805 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zf5vt" event={"ID":"45a25363-c0b4-4fdb-a773-fc99c6653bbb","Type":"ContainerStarted","Data":"74a0351bd7d4945c59830f206665b9ddfec03fcb7ab6ed441a841882ea8b47e7"} Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.958880 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zf5vt" event={"ID":"45a25363-c0b4-4fdb-a773-fc99c6653bbb","Type":"ContainerStarted","Data":"c7ba8d47fd54bf495a4cc187263d2a75299d1fdebb3a512fd284ae9cfe82df57"} Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.958912 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zf5vt" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.962134 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44cjz" event={"ID":"49ca3012-b9cb-46cd-b37c-4a74472c3fef","Type":"ContainerDied","Data":"024e8074716c7c0bc480ab3b7b2b99d8b6f47cc6869a5a5a84b83b14231e6cb1"} Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.962205 4681 scope.go:117] "RemoveContainer" containerID="36340238d503d3c6649d5cb6b1c5b227c3f92280c661d33d4e928198107e86c0" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.962332 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44cjz" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.962839 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zf5vt" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.967706 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sdgpn" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.967828 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sdgpn" event={"ID":"9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b","Type":"ContainerDied","Data":"3c79829b904c53898bc5fb946f2ad2cf413a3cb499b7f4a06b9173f9cbd13b8d"} Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.973004 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pz5k" event={"ID":"cafe2ee6-7f62-4d78-8e7e-de58c8506696","Type":"ContainerDied","Data":"675b381a73b1a3b03a92dcaf8fc504071068e5d2cc78036b77fc7a06da7043e6"} Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.973398 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7pz5k" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.979499 4681 scope.go:117] "RemoveContainer" containerID="1598ce951490d8df05df022a77b7d8e4252d3ae0ac7f7ecd44f1d05fdcff4871" Jan 22 09:06:41 crc kubenswrapper[4681]: I0122 09:06:41.989272 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zf5vt" podStartSLOduration=1.989210025 podStartE2EDuration="1.989210025s" podCreationTimestamp="2026-01-22 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:06:41.982500031 +0000 UTC m=+192.808410546" watchObservedRunningTime="2026-01-22 09:06:41.989210025 +0000 UTC m=+192.815120540" Jan 22 09:06:42 crc kubenswrapper[4681]: I0122 09:06:42.001072 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-44cjz"] Jan 22 09:06:42 crc kubenswrapper[4681]: I0122 09:06:42.008692 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-44cjz"] Jan 22 09:06:42 crc kubenswrapper[4681]: I0122 09:06:42.009581 4681 scope.go:117] "RemoveContainer" containerID="92da1bf799b307618104f2c868856efa6120d74df5cbe2b9abb792056a6cc91f" Jan 22 09:06:42 crc kubenswrapper[4681]: I0122 09:06:42.025347 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sdgpn"] Jan 22 09:06:42 crc kubenswrapper[4681]: I0122 09:06:42.038512 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sdgpn"] Jan 22 09:06:42 crc kubenswrapper[4681]: I0122 09:06:42.046503 4681 scope.go:117] "RemoveContainer" containerID="3c5465d5080b6a5c95965d29f8d5821cd64dcddf225d1fff97bf768940c1dc13" Jan 22 09:06:42 crc kubenswrapper[4681]: I0122 09:06:42.050469 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7pz5k"] Jan 22 09:06:42 crc kubenswrapper[4681]: I0122 09:06:42.053714 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7pz5k"] Jan 22 09:06:42 crc kubenswrapper[4681]: I0122 09:06:42.078476 4681 scope.go:117] "RemoveContainer" containerID="ecb7a618b93641938749213262354966dd5f0d743ae93d0dce6e50ed4cae4692" Jan 22 09:06:42 crc kubenswrapper[4681]: I0122 09:06:42.105342 4681 scope.go:117] "RemoveContainer" containerID="b4d6e8042e842996a16e1d6d91890befae5d24b50a5af4eb646e7772ea9047d0" Jan 22 09:06:42 crc kubenswrapper[4681]: I0122 09:06:42.129186 4681 scope.go:117] "RemoveContainer" containerID="26347e5fe28e5a546dd4dcf5960db288addc2253d6e3f89b83e345f634e16c6a" Jan 22 09:06:42 crc kubenswrapper[4681]: I0122 09:06:42.151751 4681 scope.go:117] "RemoveContainer" containerID="b28ff44948658b8cc9c396d6bb4d7bf57abe59549fbf2acba14b0feea55950cd" Jan 22 09:06:42 crc kubenswrapper[4681]: I0122 09:06:42.175491 4681 scope.go:117] "RemoveContainer" containerID="fb5f705aa809ba1e83dbbeaa46b90ceb50b0b856ab810ee9a5e6bbf337a9e2f2" Jan 22 09:06:43 crc kubenswrapper[4681]: I0122 09:06:43.468035 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ca3012-b9cb-46cd-b37c-4a74472c3fef" path="/var/lib/kubelet/pods/49ca3012-b9cb-46cd-b37c-4a74472c3fef/volumes" Jan 22 09:06:43 crc kubenswrapper[4681]: I0122 09:06:43.469480 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b" 
path="/var/lib/kubelet/pods/9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b/volumes" Jan 22 09:06:43 crc kubenswrapper[4681]: I0122 09:06:43.470289 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cafe2ee6-7f62-4d78-8e7e-de58c8506696" path="/var/lib/kubelet/pods/cafe2ee6-7f62-4d78-8e7e-de58c8506696/volumes" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.049924 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64996fbdb-zh4xs"] Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.050918 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" podUID="1e547b1a-19db-4893-823c-945fcbc64975" containerName="controller-manager" containerID="cri-o://698333448efe4d6d3b3708d606400cd097fe96f9e7507aeb5f1d1a1e42ac3bc7" gracePeriod=30 Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.067683 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx"] Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.067902 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" podUID="cc8edd67-e844-4fc8-af89-92b332bceea1" containerName="route-controller-manager" containerID="cri-o://e7732ec3146aa7a86dba10405929c6cfc9dfa31c239ddb2474908d6cdb9a4fda" gracePeriod=30 Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.639723 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.714308 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.766275 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k9jn\" (UniqueName: \"kubernetes.io/projected/cc8edd67-e844-4fc8-af89-92b332bceea1-kube-api-access-8k9jn\") pod \"cc8edd67-e844-4fc8-af89-92b332bceea1\" (UID: \"cc8edd67-e844-4fc8-af89-92b332bceea1\") " Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.766324 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc8edd67-e844-4fc8-af89-92b332bceea1-config\") pod \"cc8edd67-e844-4fc8-af89-92b332bceea1\" (UID: \"cc8edd67-e844-4fc8-af89-92b332bceea1\") " Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.766371 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc8edd67-e844-4fc8-af89-92b332bceea1-serving-cert\") pod \"cc8edd67-e844-4fc8-af89-92b332bceea1\" (UID: \"cc8edd67-e844-4fc8-af89-92b332bceea1\") " Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.766395 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc8edd67-e844-4fc8-af89-92b332bceea1-client-ca\") pod \"cc8edd67-e844-4fc8-af89-92b332bceea1\" (UID: \"cc8edd67-e844-4fc8-af89-92b332bceea1\") " Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.767008 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc8edd67-e844-4fc8-af89-92b332bceea1-client-ca" (OuterVolumeSpecName: "client-ca") pod "cc8edd67-e844-4fc8-af89-92b332bceea1" (UID: "cc8edd67-e844-4fc8-af89-92b332bceea1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.767860 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc8edd67-e844-4fc8-af89-92b332bceea1-config" (OuterVolumeSpecName: "config") pod "cc8edd67-e844-4fc8-af89-92b332bceea1" (UID: "cc8edd67-e844-4fc8-af89-92b332bceea1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.771721 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc8edd67-e844-4fc8-af89-92b332bceea1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cc8edd67-e844-4fc8-af89-92b332bceea1" (UID: "cc8edd67-e844-4fc8-af89-92b332bceea1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.771861 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc8edd67-e844-4fc8-af89-92b332bceea1-kube-api-access-8k9jn" (OuterVolumeSpecName: "kube-api-access-8k9jn") pod "cc8edd67-e844-4fc8-af89-92b332bceea1" (UID: "cc8edd67-e844-4fc8-af89-92b332bceea1"). InnerVolumeSpecName "kube-api-access-8k9jn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.867249 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-client-ca\") pod \"1e547b1a-19db-4893-823c-945fcbc64975\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.867391 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-proxy-ca-bundles\") pod \"1e547b1a-19db-4893-823c-945fcbc64975\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.867466 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmkgk\" (UniqueName: \"kubernetes.io/projected/1e547b1a-19db-4893-823c-945fcbc64975-kube-api-access-pmkgk\") pod \"1e547b1a-19db-4893-823c-945fcbc64975\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.867561 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e547b1a-19db-4893-823c-945fcbc64975-serving-cert\") pod \"1e547b1a-19db-4893-823c-945fcbc64975\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.867609 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-config\") pod \"1e547b1a-19db-4893-823c-945fcbc64975\" (UID: \"1e547b1a-19db-4893-823c-945fcbc64975\") " Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.867918 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k9jn\" (UniqueName: \"kubernetes.io/projected/cc8edd67-e844-4fc8-af89-92b332bceea1-kube-api-access-8k9jn\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.867951 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc8edd67-e844-4fc8-af89-92b332bceea1-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.867969 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc8edd67-e844-4fc8-af89-92b332bceea1-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.867986 4681 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc8edd67-e844-4fc8-af89-92b332bceea1-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.868188 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1e547b1a-19db-4893-823c-945fcbc64975" (UID: "1e547b1a-19db-4893-823c-945fcbc64975"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.868356 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-config" (OuterVolumeSpecName: "config") pod "1e547b1a-19db-4893-823c-945fcbc64975" (UID: "1e547b1a-19db-4893-823c-945fcbc64975"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.868697 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-client-ca" (OuterVolumeSpecName: "client-ca") pod "1e547b1a-19db-4893-823c-945fcbc64975" (UID: "1e547b1a-19db-4893-823c-945fcbc64975"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.871324 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e547b1a-19db-4893-823c-945fcbc64975-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1e547b1a-19db-4893-823c-945fcbc64975" (UID: "1e547b1a-19db-4893-823c-945fcbc64975"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.871887 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e547b1a-19db-4893-823c-945fcbc64975-kube-api-access-pmkgk" (OuterVolumeSpecName: "kube-api-access-pmkgk") pod "1e547b1a-19db-4893-823c-945fcbc64975" (UID: "1e547b1a-19db-4893-823c-945fcbc64975"). InnerVolumeSpecName "kube-api-access-pmkgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.969152 4681 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.969189 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmkgk\" (UniqueName: \"kubernetes.io/projected/1e547b1a-19db-4893-823c-945fcbc64975-kube-api-access-pmkgk\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.969200 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e547b1a-19db-4893-823c-945fcbc64975-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.969209 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:53 crc kubenswrapper[4681]: I0122 09:06:53.969218 4681 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e547b1a-19db-4893-823c-945fcbc64975-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.051050 4681 generic.go:334] "Generic (PLEG): container finished" podID="1e547b1a-19db-4893-823c-945fcbc64975" containerID="698333448efe4d6d3b3708d606400cd097fe96f9e7507aeb5f1d1a1e42ac3bc7" exitCode=0 Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.051109 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" event={"ID":"1e547b1a-19db-4893-823c-945fcbc64975","Type":"ContainerDied","Data":"698333448efe4d6d3b3708d606400cd097fe96f9e7507aeb5f1d1a1e42ac3bc7"} Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.051154 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.051180 4681 scope.go:117] "RemoveContainer" containerID="698333448efe4d6d3b3708d606400cd097fe96f9e7507aeb5f1d1a1e42ac3bc7" Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.051167 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64996fbdb-zh4xs" event={"ID":"1e547b1a-19db-4893-823c-945fcbc64975","Type":"ContainerDied","Data":"ad2039191bdda81e076f34bc6f6cebe045f56f1f4e5154b259c8e1337df41b71"} Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.053814 4681 generic.go:334] "Generic (PLEG): container finished" podID="cc8edd67-e844-4fc8-af89-92b332bceea1" containerID="e7732ec3146aa7a86dba10405929c6cfc9dfa31c239ddb2474908d6cdb9a4fda" exitCode=0 Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.053849 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" event={"ID":"cc8edd67-e844-4fc8-af89-92b332bceea1","Type":"ContainerDied","Data":"e7732ec3146aa7a86dba10405929c6cfc9dfa31c239ddb2474908d6cdb9a4fda"} Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.053868 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" event={"ID":"cc8edd67-e844-4fc8-af89-92b332bceea1","Type":"ContainerDied","Data":"7ff8c79c9468ada3fba40aa46d504cf0642c3b07f15826a7aed1c218275fafd0"} Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.053881 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx" Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.087220 4681 scope.go:117] "RemoveContainer" containerID="698333448efe4d6d3b3708d606400cd097fe96f9e7507aeb5f1d1a1e42ac3bc7" Jan 22 09:06:54 crc kubenswrapper[4681]: E0122 09:06:54.088827 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"698333448efe4d6d3b3708d606400cd097fe96f9e7507aeb5f1d1a1e42ac3bc7\": container with ID starting with 698333448efe4d6d3b3708d606400cd097fe96f9e7507aeb5f1d1a1e42ac3bc7 not found: ID does not exist" containerID="698333448efe4d6d3b3708d606400cd097fe96f9e7507aeb5f1d1a1e42ac3bc7" Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.088892 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"698333448efe4d6d3b3708d606400cd097fe96f9e7507aeb5f1d1a1e42ac3bc7"} err="failed to get container status \"698333448efe4d6d3b3708d606400cd097fe96f9e7507aeb5f1d1a1e42ac3bc7\": rpc error: code = NotFound desc = could not find container \"698333448efe4d6d3b3708d606400cd097fe96f9e7507aeb5f1d1a1e42ac3bc7\": container with ID starting with 698333448efe4d6d3b3708d606400cd097fe96f9e7507aeb5f1d1a1e42ac3bc7 not found: ID does not exist" Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.088933 4681 scope.go:117] "RemoveContainer" containerID="e7732ec3146aa7a86dba10405929c6cfc9dfa31c239ddb2474908d6cdb9a4fda" Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.096211 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx"] Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.101864 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96df84589-8t4vx"] Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.104703 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64996fbdb-zh4xs"] Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.109105 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64996fbdb-zh4xs"] Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.113906 4681 scope.go:117] "RemoveContainer" containerID="e7732ec3146aa7a86dba10405929c6cfc9dfa31c239ddb2474908d6cdb9a4fda" Jan 22 09:06:54 crc kubenswrapper[4681]: E0122 09:06:54.114482 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7732ec3146aa7a86dba10405929c6cfc9dfa31c239ddb2474908d6cdb9a4fda\": container with ID starting with e7732ec3146aa7a86dba10405929c6cfc9dfa31c239ddb2474908d6cdb9a4fda not found: ID does not exist" containerID="e7732ec3146aa7a86dba10405929c6cfc9dfa31c239ddb2474908d6cdb9a4fda" Jan 22 09:06:54 crc kubenswrapper[4681]: I0122 09:06:54.114517 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7732ec3146aa7a86dba10405929c6cfc9dfa31c239ddb2474908d6cdb9a4fda"} err="failed to get container status \"e7732ec3146aa7a86dba10405929c6cfc9dfa31c239ddb2474908d6cdb9a4fda\": rpc error: code = NotFound desc = could not find container \"e7732ec3146aa7a86dba10405929c6cfc9dfa31c239ddb2474908d6cdb9a4fda\": container with ID starting with e7732ec3146aa7a86dba10405929c6cfc9dfa31c239ddb2474908d6cdb9a4fda not found: ID does not exist" Jan 22 
09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.030454 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn"] Jan 22 09:06:55 crc kubenswrapper[4681]: E0122 09:06:55.031182 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ca3012-b9cb-46cd-b37c-4a74472c3fef" containerName="extract-content" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.031215 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ca3012-b9cb-46cd-b37c-4a74472c3fef" containerName="extract-content" Jan 22 09:06:55 crc kubenswrapper[4681]: E0122 09:06:55.031245 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cafe2ee6-7f62-4d78-8e7e-de58c8506696" containerName="extract-utilities" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.031327 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="cafe2ee6-7f62-4d78-8e7e-de58c8506696" containerName="extract-utilities" Jan 22 09:06:55 crc kubenswrapper[4681]: E0122 09:06:55.031362 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b" containerName="registry-server" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.031383 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b" containerName="registry-server" Jan 22 09:06:55 crc kubenswrapper[4681]: E0122 09:06:55.031410 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d75b145-9547-49b4-9aea-652ea33cb371" containerName="marketplace-operator" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.031427 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d75b145-9547-49b4-9aea-652ea33cb371" containerName="marketplace-operator" Jan 22 09:06:55 crc kubenswrapper[4681]: E0122 09:06:55.031445 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b" containerName="extract-content" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.031462 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b" containerName="extract-content" Jan 22 09:06:55 crc kubenswrapper[4681]: E0122 09:06:55.031488 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e85041c-4d16-4a52-ae78-e3dc2d1e81f9" containerName="extract-utilities" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.031504 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e85041c-4d16-4a52-ae78-e3dc2d1e81f9" containerName="extract-utilities" Jan 22 09:06:55 crc kubenswrapper[4681]: E0122 09:06:55.031524 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ca3012-b9cb-46cd-b37c-4a74472c3fef" containerName="extract-utilities" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.031540 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ca3012-b9cb-46cd-b37c-4a74472c3fef" containerName="extract-utilities" Jan 22 09:06:55 crc kubenswrapper[4681]: E0122 09:06:55.031563 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e85041c-4d16-4a52-ae78-e3dc2d1e81f9" containerName="extract-content" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.031579 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e85041c-4d16-4a52-ae78-e3dc2d1e81f9" containerName="extract-content" Jan 22 09:06:55 crc kubenswrapper[4681]: E0122 09:06:55.031602 4681 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="49ca3012-b9cb-46cd-b37c-4a74472c3fef" containerName="registry-server" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.031617 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ca3012-b9cb-46cd-b37c-4a74472c3fef" containerName="registry-server" Jan 22 09:06:55 crc kubenswrapper[4681]: E0122 09:06:55.031634 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc8edd67-e844-4fc8-af89-92b332bceea1" containerName="route-controller-manager" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.031655 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc8edd67-e844-4fc8-af89-92b332bceea1" containerName="route-controller-manager" Jan 22 09:06:55 crc kubenswrapper[4681]: E0122 09:06:55.031681 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e547b1a-19db-4893-823c-945fcbc64975" containerName="controller-manager" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.031697 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e547b1a-19db-4893-823c-945fcbc64975" containerName="controller-manager" Jan 22 09:06:55 crc kubenswrapper[4681]: E0122 09:06:55.031716 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cafe2ee6-7f62-4d78-8e7e-de58c8506696" containerName="extract-content" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.031732 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="cafe2ee6-7f62-4d78-8e7e-de58c8506696" containerName="extract-content" Jan 22 09:06:55 crc kubenswrapper[4681]: E0122 09:06:55.031749 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cafe2ee6-7f62-4d78-8e7e-de58c8506696" containerName="registry-server" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.031764 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="cafe2ee6-7f62-4d78-8e7e-de58c8506696" containerName="registry-server" Jan 22 09:06:55 crc kubenswrapper[4681]: E0122 09:06:55.031790 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e85041c-4d16-4a52-ae78-e3dc2d1e81f9" containerName="registry-server" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.031808 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e85041c-4d16-4a52-ae78-e3dc2d1e81f9" containerName="registry-server" Jan 22 09:06:55 crc kubenswrapper[4681]: E0122 09:06:55.031830 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b" containerName="extract-utilities" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.031846 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b" containerName="extract-utilities" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.032051 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="cafe2ee6-7f62-4d78-8e7e-de58c8506696" containerName="registry-server" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.032076 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c6a28bd-8a14-4eb3-a655-e7cb5cbe0d9b" containerName="registry-server" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.032099 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d75b145-9547-49b4-9aea-652ea33cb371" containerName="marketplace-operator" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.032127 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e547b1a-19db-4893-823c-945fcbc64975" containerName="controller-manager" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 
09:06:55.032190 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ca3012-b9cb-46cd-b37c-4a74472c3fef" containerName="registry-server" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.032222 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc8edd67-e844-4fc8-af89-92b332bceea1" containerName="route-controller-manager" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.032240 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e85041c-4d16-4a52-ae78-e3dc2d1e81f9" containerName="registry-server" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.033044 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.035293 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.035630 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.035845 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.039997 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.040018 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.039998 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.042285 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7665689847-mg9hr"] Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.043306 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.049617 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.050443 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.051138 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.051486 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.051808 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.054093 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn"] Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.054317 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.070814 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.072067 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7665689847-mg9hr"] Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.185454 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dfde443-6763-40e4-8465-3702c8dbf971-config\") pod \"route-controller-manager-6df6cd4c5-ft6xn\" (UID: \"7dfde443-6763-40e4-8465-3702c8dbf971\") " pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.185536 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92n8c\" (UniqueName: \"kubernetes.io/projected/7dfde443-6763-40e4-8465-3702c8dbf971-kube-api-access-92n8c\") pod \"route-controller-manager-6df6cd4c5-ft6xn\" (UID: \"7dfde443-6763-40e4-8465-3702c8dbf971\") " pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.185669 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dfde443-6763-40e4-8465-3702c8dbf971-serving-cert\") pod \"route-controller-manager-6df6cd4c5-ft6xn\" (UID: \"7dfde443-6763-40e4-8465-3702c8dbf971\") " pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.185788 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-client-ca\") pod \"controller-manager-7665689847-mg9hr\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " 
pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.185860 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9756733-abd0-4b98-85c1-ee2e0847a38f-serving-cert\") pod \"controller-manager-7665689847-mg9hr\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.186083 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7dfde443-6763-40e4-8465-3702c8dbf971-client-ca\") pod \"route-controller-manager-6df6cd4c5-ft6xn\" (UID: \"7dfde443-6763-40e4-8465-3702c8dbf971\") " pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.186133 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-config\") pod \"controller-manager-7665689847-mg9hr\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.186205 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-proxy-ca-bundles\") pod \"controller-manager-7665689847-mg9hr\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.186257 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn6tt\" (UniqueName: \"kubernetes.io/projected/f9756733-abd0-4b98-85c1-ee2e0847a38f-kube-api-access-vn6tt\") pod \"controller-manager-7665689847-mg9hr\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.288094 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-client-ca\") pod \"controller-manager-7665689847-mg9hr\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.288180 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9756733-abd0-4b98-85c1-ee2e0847a38f-serving-cert\") pod \"controller-manager-7665689847-mg9hr\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.288247 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7dfde443-6763-40e4-8465-3702c8dbf971-client-ca\") pod \"route-controller-manager-6df6cd4c5-ft6xn\" (UID: \"7dfde443-6763-40e4-8465-3702c8dbf971\") " pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:06:55 crc 
kubenswrapper[4681]: I0122 09:06:55.288337 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-config\") pod \"controller-manager-7665689847-mg9hr\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.288379 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-proxy-ca-bundles\") pod \"controller-manager-7665689847-mg9hr\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.288416 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn6tt\" (UniqueName: \"kubernetes.io/projected/f9756733-abd0-4b98-85c1-ee2e0847a38f-kube-api-access-vn6tt\") pod \"controller-manager-7665689847-mg9hr\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.288454 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dfde443-6763-40e4-8465-3702c8dbf971-config\") pod \"route-controller-manager-6df6cd4c5-ft6xn\" (UID: \"7dfde443-6763-40e4-8465-3702c8dbf971\") " pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.288500 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92n8c\" (UniqueName: \"kubernetes.io/projected/7dfde443-6763-40e4-8465-3702c8dbf971-kube-api-access-92n8c\") pod \"route-controller-manager-6df6cd4c5-ft6xn\" (UID: \"7dfde443-6763-40e4-8465-3702c8dbf971\") " pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.288539 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dfde443-6763-40e4-8465-3702c8dbf971-serving-cert\") pod \"route-controller-manager-6df6cd4c5-ft6xn\" (UID: \"7dfde443-6763-40e4-8465-3702c8dbf971\") " pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.289412 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7dfde443-6763-40e4-8465-3702c8dbf971-client-ca\") pod \"route-controller-manager-6df6cd4c5-ft6xn\" (UID: \"7dfde443-6763-40e4-8465-3702c8dbf971\") " pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.289438 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-client-ca\") pod \"controller-manager-7665689847-mg9hr\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.290178 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7dfde443-6763-40e4-8465-3702c8dbf971-config\") pod \"route-controller-manager-6df6cd4c5-ft6xn\" (UID: \"7dfde443-6763-40e4-8465-3702c8dbf971\") " pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.290724 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-config\") pod \"controller-manager-7665689847-mg9hr\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.291687 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-proxy-ca-bundles\") pod \"controller-manager-7665689847-mg9hr\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.300599 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dfde443-6763-40e4-8465-3702c8dbf971-serving-cert\") pod \"route-controller-manager-6df6cd4c5-ft6xn\" (UID: \"7dfde443-6763-40e4-8465-3702c8dbf971\") " pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.311086 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9756733-abd0-4b98-85c1-ee2e0847a38f-serving-cert\") pod \"controller-manager-7665689847-mg9hr\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.317405 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn6tt\" (UniqueName: \"kubernetes.io/projected/f9756733-abd0-4b98-85c1-ee2e0847a38f-kube-api-access-vn6tt\") pod \"controller-manager-7665689847-mg9hr\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.318773 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92n8c\" (UniqueName: \"kubernetes.io/projected/7dfde443-6763-40e4-8465-3702c8dbf971-kube-api-access-92n8c\") pod \"route-controller-manager-6df6cd4c5-ft6xn\" (UID: \"7dfde443-6763-40e4-8465-3702c8dbf971\") " pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.358662 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.376488 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.466541 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e547b1a-19db-4893-823c-945fcbc64975" path="/var/lib/kubelet/pods/1e547b1a-19db-4893-823c-945fcbc64975/volumes" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.468117 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc8edd67-e844-4fc8-af89-92b332bceea1" path="/var/lib/kubelet/pods/cc8edd67-e844-4fc8-af89-92b332bceea1/volumes" Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.632198 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7665689847-mg9hr"] Jan 22 09:06:55 crc kubenswrapper[4681]: W0122 09:06:55.635558 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9756733_abd0_4b98_85c1_ee2e0847a38f.slice/crio-a0f1fe1ce5bd9b860d0f43389e50340832a62ab611235b33dc99c36337a8c336 WatchSource:0}: Error finding container a0f1fe1ce5bd9b860d0f43389e50340832a62ab611235b33dc99c36337a8c336: Status 404 returned error can't find the container with id a0f1fe1ce5bd9b860d0f43389e50340832a62ab611235b33dc99c36337a8c336 Jan 22 09:06:55 crc kubenswrapper[4681]: I0122 09:06:55.772042 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn"] Jan 22 09:06:55 crc kubenswrapper[4681]: W0122 09:06:55.779405 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dfde443_6763_40e4_8465_3702c8dbf971.slice/crio-1aa3afc9518fbe7f4c7b38d1917a2b5be60a849ce5c8a7444373d08bd795b428 WatchSource:0}: Error finding container 1aa3afc9518fbe7f4c7b38d1917a2b5be60a849ce5c8a7444373d08bd795b428: Status 404 returned error can't find the container with id 1aa3afc9518fbe7f4c7b38d1917a2b5be60a849ce5c8a7444373d08bd795b428 Jan 22 09:06:56 crc kubenswrapper[4681]: I0122 09:06:56.031643 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:06:56 crc kubenswrapper[4681]: I0122 09:06:56.031963 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:06:56 crc kubenswrapper[4681]: I0122 09:06:56.032007 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:06:56 crc kubenswrapper[4681]: I0122 09:06:56.032540 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd545a538ee0d4885a7c0851ab88c778c7ae321e7dec7c6afb193d36bbeb4815"} pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 09:06:56 crc kubenswrapper[4681]: I0122 09:06:56.032598 4681 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" containerID="cri-o://dd545a538ee0d4885a7c0851ab88c778c7ae321e7dec7c6afb193d36bbeb4815" gracePeriod=600 Jan 22 09:06:56 crc kubenswrapper[4681]: I0122 09:06:56.068755 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" event={"ID":"7dfde443-6763-40e4-8465-3702c8dbf971","Type":"ContainerStarted","Data":"0dc21c3744fc7d701923d24775b1a452c9fadc3ffcdeeca15b8e4a9c1e75277d"} Jan 22 09:06:56 crc kubenswrapper[4681]: I0122 09:06:56.068817 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" event={"ID":"7dfde443-6763-40e4-8465-3702c8dbf971","Type":"ContainerStarted","Data":"1aa3afc9518fbe7f4c7b38d1917a2b5be60a849ce5c8a7444373d08bd795b428"} Jan 22 09:06:56 crc kubenswrapper[4681]: I0122 09:06:56.068848 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:06:56 crc kubenswrapper[4681]: I0122 09:06:56.070299 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" event={"ID":"f9756733-abd0-4b98-85c1-ee2e0847a38f","Type":"ContainerStarted","Data":"1f0d04c77a7dd39c196c7704abaaccd31e295caa1a2e4d0a1cdb30a5a7323bd0"} Jan 22 09:06:56 crc kubenswrapper[4681]: I0122 09:06:56.070353 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" event={"ID":"f9756733-abd0-4b98-85c1-ee2e0847a38f","Type":"ContainerStarted","Data":"a0f1fe1ce5bd9b860d0f43389e50340832a62ab611235b33dc99c36337a8c336"} Jan 22 09:06:56 crc kubenswrapper[4681]: I0122 09:06:56.070519 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:56 crc kubenswrapper[4681]: I0122 09:06:56.082034 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:06:56 crc kubenswrapper[4681]: I0122 09:06:56.084898 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" podStartSLOduration=3.084881192 podStartE2EDuration="3.084881192s" podCreationTimestamp="2026-01-22 09:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:06:56.082290404 +0000 UTC m=+206.908200939" watchObservedRunningTime="2026-01-22 09:06:56.084881192 +0000 UTC m=+206.910791697" Jan 22 09:06:56 crc kubenswrapper[4681]: I0122 09:06:56.454375 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:06:56 crc kubenswrapper[4681]: I0122 09:06:56.476093 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" podStartSLOduration=3.476073463 podStartE2EDuration="3.476073463s" podCreationTimestamp="2026-01-22 09:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-01-22 09:06:56.102845129 +0000 UTC m=+206.928755634" watchObservedRunningTime="2026-01-22 09:06:56.476073463 +0000 UTC m=+207.301983968" Jan 22 09:06:57 crc kubenswrapper[4681]: I0122 09:06:57.082894 4681 generic.go:334] "Generic (PLEG): container finished" podID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerID="dd545a538ee0d4885a7c0851ab88c778c7ae321e7dec7c6afb193d36bbeb4815" exitCode=0 Jan 22 09:06:57 crc kubenswrapper[4681]: I0122 09:06:57.082912 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerDied","Data":"dd545a538ee0d4885a7c0851ab88c778c7ae321e7dec7c6afb193d36bbeb4815"} Jan 22 09:06:57 crc kubenswrapper[4681]: I0122 09:06:57.084384 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerStarted","Data":"b4ec2fa3e3aaacfce78c09a1f6c50b2addaa48c1eb65926acc4c02cf4a2b90d9"} Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.038239 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fd8nb"] Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.039634 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.051149 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fd8nb"] Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.174866 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/146b1362-c7d6-41cf-a1bf-bc7fee84562b-bound-sa-token\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.174909 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/146b1362-c7d6-41cf-a1bf-bc7fee84562b-trusted-ca\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.174946 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/146b1362-c7d6-41cf-a1bf-bc7fee84562b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.174969 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/146b1362-c7d6-41cf-a1bf-bc7fee84562b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.174990 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/146b1362-c7d6-41cf-a1bf-bc7fee84562b-registry-tls\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.175015 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqc64\" (UniqueName: \"kubernetes.io/projected/146b1362-c7d6-41cf-a1bf-bc7fee84562b-kube-api-access-nqc64\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.175039 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/146b1362-c7d6-41cf-a1bf-bc7fee84562b-registry-certificates\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.175068 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.198005 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.276695 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/146b1362-c7d6-41cf-a1bf-bc7fee84562b-bound-sa-token\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.277056 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/146b1362-c7d6-41cf-a1bf-bc7fee84562b-trusted-ca\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.277196 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/146b1362-c7d6-41cf-a1bf-bc7fee84562b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.277327 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/146b1362-c7d6-41cf-a1bf-bc7fee84562b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.277441 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/146b1362-c7d6-41cf-a1bf-bc7fee84562b-registry-tls\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.277550 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqc64\" (UniqueName: \"kubernetes.io/projected/146b1362-c7d6-41cf-a1bf-bc7fee84562b-kube-api-access-nqc64\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.277657 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/146b1362-c7d6-41cf-a1bf-bc7fee84562b-registry-certificates\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.277970 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/146b1362-c7d6-41cf-a1bf-bc7fee84562b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.278991 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/146b1362-c7d6-41cf-a1bf-bc7fee84562b-registry-certificates\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.279573 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/146b1362-c7d6-41cf-a1bf-bc7fee84562b-trusted-ca\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.284929 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/146b1362-c7d6-41cf-a1bf-bc7fee84562b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.285006 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/146b1362-c7d6-41cf-a1bf-bc7fee84562b-registry-tls\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc 
kubenswrapper[4681]: I0122 09:07:53.299367 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/146b1362-c7d6-41cf-a1bf-bc7fee84562b-bound-sa-token\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.302882 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqc64\" (UniqueName: \"kubernetes.io/projected/146b1362-c7d6-41cf-a1bf-bc7fee84562b-kube-api-access-nqc64\") pod \"image-registry-66df7c8f76-fd8nb\" (UID: \"146b1362-c7d6-41cf-a1bf-bc7fee84562b\") " pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.358252 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:53 crc kubenswrapper[4681]: I0122 09:07:53.845368 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fd8nb"] Jan 22 09:07:54 crc kubenswrapper[4681]: I0122 09:07:54.440803 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" event={"ID":"146b1362-c7d6-41cf-a1bf-bc7fee84562b","Type":"ContainerStarted","Data":"6d3e10de6fbb1166d62394895c9f2ef76a5f561de081d469922af7f4b71a03ef"} Jan 22 09:07:54 crc kubenswrapper[4681]: I0122 09:07:54.441110 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" event={"ID":"146b1362-c7d6-41cf-a1bf-bc7fee84562b","Type":"ContainerStarted","Data":"ba0aed42474d864ff807e0057c6ad01512d0d41c17b347a70a535e94d577c348"} Jan 22 09:07:54 crc kubenswrapper[4681]: I0122 09:07:54.441139 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:07:54 crc kubenswrapper[4681]: I0122 09:07:54.461739 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" podStartSLOduration=1.46171322 podStartE2EDuration="1.46171322s" podCreationTimestamp="2026-01-22 09:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:07:54.459051239 +0000 UTC m=+265.284961784" watchObservedRunningTime="2026-01-22 09:07:54.46171322 +0000 UTC m=+265.287623765" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.071692 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8lt9z"] Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.075664 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8lt9z" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.083894 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.086362 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8lt9z"] Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.140397 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqszh\" (UniqueName: \"kubernetes.io/projected/106c3866-7eec-46b8-bf93-cd3edad7c59c-kube-api-access-nqszh\") pod \"redhat-operators-8lt9z\" (UID: \"106c3866-7eec-46b8-bf93-cd3edad7c59c\") " pod="openshift-marketplace/redhat-operators-8lt9z" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.140458 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/106c3866-7eec-46b8-bf93-cd3edad7c59c-catalog-content\") pod \"redhat-operators-8lt9z\" (UID: \"106c3866-7eec-46b8-bf93-cd3edad7c59c\") " pod="openshift-marketplace/redhat-operators-8lt9z" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.140560 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/106c3866-7eec-46b8-bf93-cd3edad7c59c-utilities\") pod \"redhat-operators-8lt9z\" (UID: \"106c3866-7eec-46b8-bf93-cd3edad7c59c\") " pod="openshift-marketplace/redhat-operators-8lt9z" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.241664 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/106c3866-7eec-46b8-bf93-cd3edad7c59c-utilities\") pod \"redhat-operators-8lt9z\" (UID: \"106c3866-7eec-46b8-bf93-cd3edad7c59c\") " pod="openshift-marketplace/redhat-operators-8lt9z" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.241822 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqszh\" (UniqueName: \"kubernetes.io/projected/106c3866-7eec-46b8-bf93-cd3edad7c59c-kube-api-access-nqszh\") pod \"redhat-operators-8lt9z\" (UID: \"106c3866-7eec-46b8-bf93-cd3edad7c59c\") " pod="openshift-marketplace/redhat-operators-8lt9z" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.241863 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/106c3866-7eec-46b8-bf93-cd3edad7c59c-catalog-content\") pod \"redhat-operators-8lt9z\" (UID: \"106c3866-7eec-46b8-bf93-cd3edad7c59c\") " pod="openshift-marketplace/redhat-operators-8lt9z" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.242460 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/106c3866-7eec-46b8-bf93-cd3edad7c59c-utilities\") pod \"redhat-operators-8lt9z\" (UID: \"106c3866-7eec-46b8-bf93-cd3edad7c59c\") " pod="openshift-marketplace/redhat-operators-8lt9z" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.242599 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/106c3866-7eec-46b8-bf93-cd3edad7c59c-catalog-content\") pod \"redhat-operators-8lt9z\" (UID: \"106c3866-7eec-46b8-bf93-cd3edad7c59c\") " 
pod="openshift-marketplace/redhat-operators-8lt9z" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.257802 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rbqlg"] Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.260753 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rbqlg" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.263342 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.270955 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rbqlg"] Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.293397 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqszh\" (UniqueName: \"kubernetes.io/projected/106c3866-7eec-46b8-bf93-cd3edad7c59c-kube-api-access-nqszh\") pod \"redhat-operators-8lt9z\" (UID: \"106c3866-7eec-46b8-bf93-cd3edad7c59c\") " pod="openshift-marketplace/redhat-operators-8lt9z" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.419198 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8lt9z" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.445671 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96dg6\" (UniqueName: \"kubernetes.io/projected/b7828afb-82af-4b1c-a8f8-900963d42fd1-kube-api-access-96dg6\") pod \"community-operators-rbqlg\" (UID: \"b7828afb-82af-4b1c-a8f8-900963d42fd1\") " pod="openshift-marketplace/community-operators-rbqlg" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.445928 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7828afb-82af-4b1c-a8f8-900963d42fd1-catalog-content\") pod \"community-operators-rbqlg\" (UID: \"b7828afb-82af-4b1c-a8f8-900963d42fd1\") " pod="openshift-marketplace/community-operators-rbqlg" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.445988 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7828afb-82af-4b1c-a8f8-900963d42fd1-utilities\") pod \"community-operators-rbqlg\" (UID: \"b7828afb-82af-4b1c-a8f8-900963d42fd1\") " pod="openshift-marketplace/community-operators-rbqlg" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.547234 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7828afb-82af-4b1c-a8f8-900963d42fd1-catalog-content\") pod \"community-operators-rbqlg\" (UID: \"b7828afb-82af-4b1c-a8f8-900963d42fd1\") " pod="openshift-marketplace/community-operators-rbqlg" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.547571 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7828afb-82af-4b1c-a8f8-900963d42fd1-utilities\") pod \"community-operators-rbqlg\" (UID: \"b7828afb-82af-4b1c-a8f8-900963d42fd1\") " pod="openshift-marketplace/community-operators-rbqlg" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.547719 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-96dg6\" (UniqueName: \"kubernetes.io/projected/b7828afb-82af-4b1c-a8f8-900963d42fd1-kube-api-access-96dg6\") pod \"community-operators-rbqlg\" (UID: \"b7828afb-82af-4b1c-a8f8-900963d42fd1\") " pod="openshift-marketplace/community-operators-rbqlg" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.549210 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7828afb-82af-4b1c-a8f8-900963d42fd1-catalog-content\") pod \"community-operators-rbqlg\" (UID: \"b7828afb-82af-4b1c-a8f8-900963d42fd1\") " pod="openshift-marketplace/community-operators-rbqlg" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.550346 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7828afb-82af-4b1c-a8f8-900963d42fd1-utilities\") pod \"community-operators-rbqlg\" (UID: \"b7828afb-82af-4b1c-a8f8-900963d42fd1\") " pod="openshift-marketplace/community-operators-rbqlg" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.582601 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96dg6\" (UniqueName: \"kubernetes.io/projected/b7828afb-82af-4b1c-a8f8-900963d42fd1-kube-api-access-96dg6\") pod \"community-operators-rbqlg\" (UID: \"b7828afb-82af-4b1c-a8f8-900963d42fd1\") " pod="openshift-marketplace/community-operators-rbqlg" Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.689707 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8lt9z"] Jan 22 09:08:04 crc kubenswrapper[4681]: I0122 09:08:04.881828 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rbqlg" Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.350077 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rbqlg"] Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.508348 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbqlg" event={"ID":"b7828afb-82af-4b1c-a8f8-900963d42fd1","Type":"ContainerStarted","Data":"c768e8faf6bcb0c7ea688943cc5f07ba73e31fa1d49de40accc17cd26cfbcdcf"} Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.508391 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbqlg" event={"ID":"b7828afb-82af-4b1c-a8f8-900963d42fd1","Type":"ContainerStarted","Data":"cae1bca3b23e50c3c5418fddabca3f6f5eb6e0d97b8e70f8b6560f25370ca520"} Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.511477 4681 generic.go:334] "Generic (PLEG): container finished" podID="106c3866-7eec-46b8-bf93-cd3edad7c59c" containerID="de3de35a03b6f3e0ee8e3c0cfd04c35fcbb2ade6683f8bf9c3088a468adc7c2d" exitCode=0 Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.511546 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lt9z" event={"ID":"106c3866-7eec-46b8-bf93-cd3edad7c59c","Type":"ContainerDied","Data":"de3de35a03b6f3e0ee8e3c0cfd04c35fcbb2ade6683f8bf9c3088a468adc7c2d"} Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.511711 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lt9z" event={"ID":"106c3866-7eec-46b8-bf93-cd3edad7c59c","Type":"ContainerStarted","Data":"99768cdad7b29d35bb4719832a73e2854740c709b3cd6c93bbaa06f6d8da0af3"} Jan 22 09:08:05 crc kubenswrapper[4681]: 
I0122 09:08:05.662564 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7jtrb"] Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.665999 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7jtrb" Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.668651 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.674398 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7jtrb"] Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.764836 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94bafc7c-e628-4827-b1b7-f016d562bd9f-catalog-content\") pod \"certified-operators-7jtrb\" (UID: \"94bafc7c-e628-4827-b1b7-f016d562bd9f\") " pod="openshift-marketplace/certified-operators-7jtrb" Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.764920 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfkfm\" (UniqueName: \"kubernetes.io/projected/94bafc7c-e628-4827-b1b7-f016d562bd9f-kube-api-access-vfkfm\") pod \"certified-operators-7jtrb\" (UID: \"94bafc7c-e628-4827-b1b7-f016d562bd9f\") " pod="openshift-marketplace/certified-operators-7jtrb" Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.765091 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94bafc7c-e628-4827-b1b7-f016d562bd9f-utilities\") pod \"certified-operators-7jtrb\" (UID: \"94bafc7c-e628-4827-b1b7-f016d562bd9f\") " pod="openshift-marketplace/certified-operators-7jtrb" Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.866401 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94bafc7c-e628-4827-b1b7-f016d562bd9f-catalog-content\") pod \"certified-operators-7jtrb\" (UID: \"94bafc7c-e628-4827-b1b7-f016d562bd9f\") " pod="openshift-marketplace/certified-operators-7jtrb" Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.866464 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfkfm\" (UniqueName: \"kubernetes.io/projected/94bafc7c-e628-4827-b1b7-f016d562bd9f-kube-api-access-vfkfm\") pod \"certified-operators-7jtrb\" (UID: \"94bafc7c-e628-4827-b1b7-f016d562bd9f\") " pod="openshift-marketplace/certified-operators-7jtrb" Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.866515 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94bafc7c-e628-4827-b1b7-f016d562bd9f-utilities\") pod \"certified-operators-7jtrb\" (UID: \"94bafc7c-e628-4827-b1b7-f016d562bd9f\") " pod="openshift-marketplace/certified-operators-7jtrb" Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.867021 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94bafc7c-e628-4827-b1b7-f016d562bd9f-utilities\") pod \"certified-operators-7jtrb\" (UID: \"94bafc7c-e628-4827-b1b7-f016d562bd9f\") " pod="openshift-marketplace/certified-operators-7jtrb" Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 
09:08:05.867674 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94bafc7c-e628-4827-b1b7-f016d562bd9f-catalog-content\") pod \"certified-operators-7jtrb\" (UID: \"94bafc7c-e628-4827-b1b7-f016d562bd9f\") " pod="openshift-marketplace/certified-operators-7jtrb" Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.897637 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfkfm\" (UniqueName: \"kubernetes.io/projected/94bafc7c-e628-4827-b1b7-f016d562bd9f-kube-api-access-vfkfm\") pod \"certified-operators-7jtrb\" (UID: \"94bafc7c-e628-4827-b1b7-f016d562bd9f\") " pod="openshift-marketplace/certified-operators-7jtrb" Jan 22 09:08:05 crc kubenswrapper[4681]: I0122 09:08:05.996370 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7jtrb" Jan 22 09:08:06 crc kubenswrapper[4681]: I0122 09:08:06.444615 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7jtrb"] Jan 22 09:08:06 crc kubenswrapper[4681]: I0122 09:08:06.516867 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7jtrb" event={"ID":"94bafc7c-e628-4827-b1b7-f016d562bd9f","Type":"ContainerStarted","Data":"521a74266c24908fe8216a027bd04ad51e37d4c71e22fe7d16f0bf49a20c6d96"} Jan 22 09:08:06 crc kubenswrapper[4681]: I0122 09:08:06.518151 4681 generic.go:334] "Generic (PLEG): container finished" podID="b7828afb-82af-4b1c-a8f8-900963d42fd1" containerID="c768e8faf6bcb0c7ea688943cc5f07ba73e31fa1d49de40accc17cd26cfbcdcf" exitCode=0 Jan 22 09:08:06 crc kubenswrapper[4681]: I0122 09:08:06.518186 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbqlg" event={"ID":"b7828afb-82af-4b1c-a8f8-900963d42fd1","Type":"ContainerDied","Data":"c768e8faf6bcb0c7ea688943cc5f07ba73e31fa1d49de40accc17cd26cfbcdcf"} Jan 22 09:08:06 crc kubenswrapper[4681]: I0122 09:08:06.861845 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zz8lv"] Jan 22 09:08:06 crc kubenswrapper[4681]: I0122 09:08:06.863059 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zz8lv" Jan 22 09:08:06 crc kubenswrapper[4681]: I0122 09:08:06.872342 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 22 09:08:06 crc kubenswrapper[4681]: I0122 09:08:06.890143 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zz8lv"] Jan 22 09:08:06 crc kubenswrapper[4681]: I0122 09:08:06.981362 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd42106-064b-47b9-8d13-19d1e5ed4959-catalog-content\") pod \"redhat-marketplace-zz8lv\" (UID: \"5dd42106-064b-47b9-8d13-19d1e5ed4959\") " pod="openshift-marketplace/redhat-marketplace-zz8lv" Jan 22 09:08:06 crc kubenswrapper[4681]: I0122 09:08:06.981586 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzsjl\" (UniqueName: \"kubernetes.io/projected/5dd42106-064b-47b9-8d13-19d1e5ed4959-kube-api-access-mzsjl\") pod \"redhat-marketplace-zz8lv\" (UID: \"5dd42106-064b-47b9-8d13-19d1e5ed4959\") " pod="openshift-marketplace/redhat-marketplace-zz8lv" Jan 22 09:08:06 crc kubenswrapper[4681]: I0122 09:08:06.981636 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd42106-064b-47b9-8d13-19d1e5ed4959-utilities\") pod \"redhat-marketplace-zz8lv\" (UID: \"5dd42106-064b-47b9-8d13-19d1e5ed4959\") " pod="openshift-marketplace/redhat-marketplace-zz8lv" Jan 22 09:08:07 crc kubenswrapper[4681]: I0122 09:08:07.083534 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd42106-064b-47b9-8d13-19d1e5ed4959-catalog-content\") pod \"redhat-marketplace-zz8lv\" (UID: \"5dd42106-064b-47b9-8d13-19d1e5ed4959\") " pod="openshift-marketplace/redhat-marketplace-zz8lv" Jan 22 09:08:07 crc kubenswrapper[4681]: I0122 09:08:07.083626 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzsjl\" (UniqueName: \"kubernetes.io/projected/5dd42106-064b-47b9-8d13-19d1e5ed4959-kube-api-access-mzsjl\") pod \"redhat-marketplace-zz8lv\" (UID: \"5dd42106-064b-47b9-8d13-19d1e5ed4959\") " pod="openshift-marketplace/redhat-marketplace-zz8lv" Jan 22 09:08:07 crc kubenswrapper[4681]: I0122 09:08:07.083655 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd42106-064b-47b9-8d13-19d1e5ed4959-utilities\") pod \"redhat-marketplace-zz8lv\" (UID: \"5dd42106-064b-47b9-8d13-19d1e5ed4959\") " pod="openshift-marketplace/redhat-marketplace-zz8lv" Jan 22 09:08:07 crc kubenswrapper[4681]: I0122 09:08:07.084137 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd42106-064b-47b9-8d13-19d1e5ed4959-utilities\") pod \"redhat-marketplace-zz8lv\" (UID: \"5dd42106-064b-47b9-8d13-19d1e5ed4959\") " pod="openshift-marketplace/redhat-marketplace-zz8lv" Jan 22 09:08:07 crc kubenswrapper[4681]: I0122 09:08:07.084372 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd42106-064b-47b9-8d13-19d1e5ed4959-catalog-content\") pod \"redhat-marketplace-zz8lv\" (UID: 
\"5dd42106-064b-47b9-8d13-19d1e5ed4959\") " pod="openshift-marketplace/redhat-marketplace-zz8lv" Jan 22 09:08:07 crc kubenswrapper[4681]: I0122 09:08:07.108932 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzsjl\" (UniqueName: \"kubernetes.io/projected/5dd42106-064b-47b9-8d13-19d1e5ed4959-kube-api-access-mzsjl\") pod \"redhat-marketplace-zz8lv\" (UID: \"5dd42106-064b-47b9-8d13-19d1e5ed4959\") " pod="openshift-marketplace/redhat-marketplace-zz8lv" Jan 22 09:08:07 crc kubenswrapper[4681]: I0122 09:08:07.203217 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zz8lv" Jan 22 09:08:07 crc kubenswrapper[4681]: I0122 09:08:07.527968 4681 generic.go:334] "Generic (PLEG): container finished" podID="94bafc7c-e628-4827-b1b7-f016d562bd9f" containerID="b8a3b890efb1a9deeac7f0ba9e3f37ad70b72569f722f8f9fba26c690c9537e3" exitCode=0 Jan 22 09:08:07 crc kubenswrapper[4681]: I0122 09:08:07.529008 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7jtrb" event={"ID":"94bafc7c-e628-4827-b1b7-f016d562bd9f","Type":"ContainerDied","Data":"b8a3b890efb1a9deeac7f0ba9e3f37ad70b72569f722f8f9fba26c690c9537e3"} Jan 22 09:08:07 crc kubenswrapper[4681]: I0122 09:08:07.531115 4681 generic.go:334] "Generic (PLEG): container finished" podID="106c3866-7eec-46b8-bf93-cd3edad7c59c" containerID="ec64a5981d811aab782d9b0db92c9bf23e1845e6b346038044926f6a6a42ce1b" exitCode=0 Jan 22 09:08:07 crc kubenswrapper[4681]: I0122 09:08:07.531161 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lt9z" event={"ID":"106c3866-7eec-46b8-bf93-cd3edad7c59c","Type":"ContainerDied","Data":"ec64a5981d811aab782d9b0db92c9bf23e1845e6b346038044926f6a6a42ce1b"} Jan 22 09:08:07 crc kubenswrapper[4681]: I0122 09:08:07.682229 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zz8lv"] Jan 22 09:08:07 crc kubenswrapper[4681]: W0122 09:08:07.686902 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dd42106_064b_47b9_8d13_19d1e5ed4959.slice/crio-133decd1b6879aa2a8792da2ed91c7f56968a4f14196934b0cbb1ad1f5e1ffbc WatchSource:0}: Error finding container 133decd1b6879aa2a8792da2ed91c7f56968a4f14196934b0cbb1ad1f5e1ffbc: Status 404 returned error can't find the container with id 133decd1b6879aa2a8792da2ed91c7f56968a4f14196934b0cbb1ad1f5e1ffbc Jan 22 09:08:08 crc kubenswrapper[4681]: I0122 09:08:08.536103 4681 generic.go:334] "Generic (PLEG): container finished" podID="5dd42106-064b-47b9-8d13-19d1e5ed4959" containerID="8626f50a80c0476f044b7bc5f0b4744af38b9d914f1c0f072024d8aaa282eee6" exitCode=0 Jan 22 09:08:08 crc kubenswrapper[4681]: I0122 09:08:08.536198 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zz8lv" event={"ID":"5dd42106-064b-47b9-8d13-19d1e5ed4959","Type":"ContainerDied","Data":"8626f50a80c0476f044b7bc5f0b4744af38b9d914f1c0f072024d8aaa282eee6"} Jan 22 09:08:08 crc kubenswrapper[4681]: I0122 09:08:08.536272 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zz8lv" event={"ID":"5dd42106-064b-47b9-8d13-19d1e5ed4959","Type":"ContainerStarted","Data":"133decd1b6879aa2a8792da2ed91c7f56968a4f14196934b0cbb1ad1f5e1ffbc"} Jan 22 09:08:08 crc kubenswrapper[4681]: I0122 09:08:08.538095 4681 generic.go:334] "Generic (PLEG): 
container finished" podID="b7828afb-82af-4b1c-a8f8-900963d42fd1" containerID="df766190bd2d68d7b6f82dfac6ac1af6e39cc284494360fced429350e6760324" exitCode=0 Jan 22 09:08:08 crc kubenswrapper[4681]: I0122 09:08:08.538118 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbqlg" event={"ID":"b7828afb-82af-4b1c-a8f8-900963d42fd1","Type":"ContainerDied","Data":"df766190bd2d68d7b6f82dfac6ac1af6e39cc284494360fced429350e6760324"} Jan 22 09:08:09 crc kubenswrapper[4681]: I0122 09:08:09.546483 4681 generic.go:334] "Generic (PLEG): container finished" podID="94bafc7c-e628-4827-b1b7-f016d562bd9f" containerID="91c115c5efcb89e764b409966cb61141f03a027a34dbccddf0e1df7bb5d56c44" exitCode=0 Jan 22 09:08:09 crc kubenswrapper[4681]: I0122 09:08:09.546558 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7jtrb" event={"ID":"94bafc7c-e628-4827-b1b7-f016d562bd9f","Type":"ContainerDied","Data":"91c115c5efcb89e764b409966cb61141f03a027a34dbccddf0e1df7bb5d56c44"} Jan 22 09:08:09 crc kubenswrapper[4681]: I0122 09:08:09.549879 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lt9z" event={"ID":"106c3866-7eec-46b8-bf93-cd3edad7c59c","Type":"ContainerStarted","Data":"65d2c590783822ba758bdc0e470a61cddc1b11ad0b8ad76bcf556c7354fa1935"} Jan 22 09:08:09 crc kubenswrapper[4681]: I0122 09:08:09.551841 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbqlg" event={"ID":"b7828afb-82af-4b1c-a8f8-900963d42fd1","Type":"ContainerStarted","Data":"bfa036fda4550bdbf64dfb156022aff8f2e0773dac733a6c4d772c5e0303c3ed"} Jan 22 09:08:09 crc kubenswrapper[4681]: I0122 09:08:09.597090 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8lt9z" podStartSLOduration=2.842922482 podStartE2EDuration="5.59706466s" podCreationTimestamp="2026-01-22 09:08:04 +0000 UTC" firstStartedPulling="2026-01-22 09:08:05.513331624 +0000 UTC m=+276.339242129" lastFinishedPulling="2026-01-22 09:08:08.267473772 +0000 UTC m=+279.093384307" observedRunningTime="2026-01-22 09:08:09.593437414 +0000 UTC m=+280.419347969" watchObservedRunningTime="2026-01-22 09:08:09.59706466 +0000 UTC m=+280.422975205" Jan 22 09:08:09 crc kubenswrapper[4681]: I0122 09:08:09.621825 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rbqlg" podStartSLOduration=3.143882111 podStartE2EDuration="5.621800647s" podCreationTimestamp="2026-01-22 09:08:04 +0000 UTC" firstStartedPulling="2026-01-22 09:08:06.519249288 +0000 UTC m=+277.345159793" lastFinishedPulling="2026-01-22 09:08:08.997167814 +0000 UTC m=+279.823078329" observedRunningTime="2026-01-22 09:08:09.613340352 +0000 UTC m=+280.439250907" watchObservedRunningTime="2026-01-22 09:08:09.621800647 +0000 UTC m=+280.447711182" Jan 22 09:08:10 crc kubenswrapper[4681]: I0122 09:08:10.558784 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7jtrb" event={"ID":"94bafc7c-e628-4827-b1b7-f016d562bd9f","Type":"ContainerStarted","Data":"213843b50db23e594e7d27cbd5d530e8e3b588e3fbf79ca6f0d351540da5de56"} Jan 22 09:08:10 crc kubenswrapper[4681]: I0122 09:08:10.560899 4681 generic.go:334] "Generic (PLEG): container finished" podID="5dd42106-064b-47b9-8d13-19d1e5ed4959" containerID="c2e7f95f66ccd1ee1a7321b3e6c94ba1337f60ef235958530fac73db2b9815b9" exitCode=0 Jan 22 09:08:10 crc 
kubenswrapper[4681]: I0122 09:08:10.560958 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zz8lv" event={"ID":"5dd42106-064b-47b9-8d13-19d1e5ed4959","Type":"ContainerDied","Data":"c2e7f95f66ccd1ee1a7321b3e6c94ba1337f60ef235958530fac73db2b9815b9"} Jan 22 09:08:10 crc kubenswrapper[4681]: I0122 09:08:10.590672 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7jtrb" podStartSLOduration=2.9020437599999998 podStartE2EDuration="5.590648688s" podCreationTimestamp="2026-01-22 09:08:05 +0000 UTC" firstStartedPulling="2026-01-22 09:08:07.529974442 +0000 UTC m=+278.355884987" lastFinishedPulling="2026-01-22 09:08:10.21857941 +0000 UTC m=+281.044489915" observedRunningTime="2026-01-22 09:08:10.589147599 +0000 UTC m=+281.415058104" watchObservedRunningTime="2026-01-22 09:08:10.590648688 +0000 UTC m=+281.416559223" Jan 22 09:08:11 crc kubenswrapper[4681]: I0122 09:08:11.580227 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zz8lv" event={"ID":"5dd42106-064b-47b9-8d13-19d1e5ed4959","Type":"ContainerStarted","Data":"e21c61baaab33aeacadf349c125a6f3f3d923c5ac12d42ee6b26b6e9cdc7b2aa"} Jan 22 09:08:11 crc kubenswrapper[4681]: I0122 09:08:11.603184 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zz8lv" podStartSLOduration=3.079827419 podStartE2EDuration="5.603142688s" podCreationTimestamp="2026-01-22 09:08:06 +0000 UTC" firstStartedPulling="2026-01-22 09:08:08.537585523 +0000 UTC m=+279.363496028" lastFinishedPulling="2026-01-22 09:08:11.060900772 +0000 UTC m=+281.886811297" observedRunningTime="2026-01-22 09:08:11.600846557 +0000 UTC m=+282.426757072" watchObservedRunningTime="2026-01-22 09:08:11.603142688 +0000 UTC m=+282.429053203" Jan 22 09:08:15 crc kubenswrapper[4681]: I0122 09:08:13.030995 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7665689847-mg9hr"] Jan 22 09:08:15 crc kubenswrapper[4681]: I0122 09:08:13.031449 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" podUID="f9756733-abd0-4b98-85c1-ee2e0847a38f" containerName="controller-manager" containerID="cri-o://1f0d04c77a7dd39c196c7704abaaccd31e295caa1a2e4d0a1cdb30a5a7323bd0" gracePeriod=30 Jan 22 09:08:15 crc kubenswrapper[4681]: I0122 09:08:13.050533 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn"] Jan 22 09:08:15 crc kubenswrapper[4681]: I0122 09:08:13.050723 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" podUID="7dfde443-6763-40e4-8465-3702c8dbf971" containerName="route-controller-manager" containerID="cri-o://0dc21c3744fc7d701923d24775b1a452c9fadc3ffcdeeca15b8e4a9c1e75277d" gracePeriod=30 Jan 22 09:08:15 crc kubenswrapper[4681]: I0122 09:08:13.362894 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-fd8nb" Jan 22 09:08:15 crc kubenswrapper[4681]: I0122 09:08:13.414907 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-c287z"] Jan 22 09:08:15 crc kubenswrapper[4681]: I0122 09:08:14.419838 4681 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8lt9z" Jan 22 09:08:15 crc kubenswrapper[4681]: I0122 09:08:14.421093 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8lt9z" Jan 22 09:08:15 crc kubenswrapper[4681]: I0122 09:08:14.882173 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rbqlg" Jan 22 09:08:15 crc kubenswrapper[4681]: I0122 09:08:14.882255 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rbqlg" Jan 22 09:08:15 crc kubenswrapper[4681]: I0122 09:08:14.961701 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rbqlg" Jan 22 09:08:15 crc kubenswrapper[4681]: I0122 09:08:15.359914 4681 patch_prober.go:28] interesting pod/route-controller-manager-6df6cd4c5-ft6xn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body= Jan 22 09:08:15 crc kubenswrapper[4681]: I0122 09:08:15.359973 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" podUID="7dfde443-6763-40e4-8465-3702c8dbf971" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" Jan 22 09:08:15 crc kubenswrapper[4681]: I0122 09:08:15.377604 4681 patch_prober.go:28] interesting pod/controller-manager-7665689847-mg9hr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body= Jan 22 09:08:15 crc kubenswrapper[4681]: I0122 09:08:15.377654 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" podUID="f9756733-abd0-4b98-85c1-ee2e0847a38f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Jan 22 09:08:15 crc kubenswrapper[4681]: I0122 09:08:15.463142 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8lt9z" podUID="106c3866-7eec-46b8-bf93-cd3edad7c59c" containerName="registry-server" probeResult="failure" output=< Jan 22 09:08:15 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Jan 22 09:08:15 crc kubenswrapper[4681]: > Jan 22 09:08:16 crc kubenswrapper[4681]: I0122 09:08:15.650918 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rbqlg" Jan 22 09:08:16 crc kubenswrapper[4681]: I0122 09:08:15.996572 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7jtrb" Jan 22 09:08:16 crc kubenswrapper[4681]: I0122 09:08:15.996887 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7jtrb" Jan 22 09:08:16 crc kubenswrapper[4681]: I0122 09:08:16.043334 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-7jtrb" Jan 22 09:08:16 crc kubenswrapper[4681]: I0122 09:08:16.682484 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7jtrb" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.203526 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zz8lv" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.203651 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zz8lv" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.250786 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zz8lv" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.259112 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.262900 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.308078 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64996fbdb-rbwmc"] Jan 22 09:08:17 crc kubenswrapper[4681]: E0122 09:08:17.308465 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9756733-abd0-4b98-85c1-ee2e0847a38f" containerName="controller-manager" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.309643 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9756733-abd0-4b98-85c1-ee2e0847a38f" containerName="controller-manager" Jan 22 09:08:17 crc kubenswrapper[4681]: E0122 09:08:17.309689 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dfde443-6763-40e4-8465-3702c8dbf971" containerName="route-controller-manager" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.309699 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dfde443-6763-40e4-8465-3702c8dbf971" containerName="route-controller-manager" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.309855 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dfde443-6763-40e4-8465-3702c8dbf971" containerName="route-controller-manager" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.309882 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9756733-abd0-4b98-85c1-ee2e0847a38f" containerName="controller-manager" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.310456 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.321175 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64996fbdb-rbwmc"] Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.457932 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92n8c\" (UniqueName: \"kubernetes.io/projected/7dfde443-6763-40e4-8465-3702c8dbf971-kube-api-access-92n8c\") pod \"7dfde443-6763-40e4-8465-3702c8dbf971\" (UID: \"7dfde443-6763-40e4-8465-3702c8dbf971\") " Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.458024 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-config\") pod \"f9756733-abd0-4b98-85c1-ee2e0847a38f\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.458107 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-proxy-ca-bundles\") pod \"f9756733-abd0-4b98-85c1-ee2e0847a38f\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.458144 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9756733-abd0-4b98-85c1-ee2e0847a38f-serving-cert\") pod \"f9756733-abd0-4b98-85c1-ee2e0847a38f\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.458175 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dfde443-6763-40e4-8465-3702c8dbf971-config\") pod \"7dfde443-6763-40e4-8465-3702c8dbf971\" (UID: \"7dfde443-6763-40e4-8465-3702c8dbf971\") " Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.458224 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-client-ca\") pod \"f9756733-abd0-4b98-85c1-ee2e0847a38f\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.458320 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dfde443-6763-40e4-8465-3702c8dbf971-serving-cert\") pod \"7dfde443-6763-40e4-8465-3702c8dbf971\" (UID: \"7dfde443-6763-40e4-8465-3702c8dbf971\") " Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.458841 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dfde443-6763-40e4-8465-3702c8dbf971-config" (OuterVolumeSpecName: "config") pod "7dfde443-6763-40e4-8465-3702c8dbf971" (UID: "7dfde443-6763-40e4-8465-3702c8dbf971"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.458952 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-client-ca" (OuterVolumeSpecName: "client-ca") pod "f9756733-abd0-4b98-85c1-ee2e0847a38f" (UID: "f9756733-abd0-4b98-85c1-ee2e0847a38f"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.458961 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f9756733-abd0-4b98-85c1-ee2e0847a38f" (UID: "f9756733-abd0-4b98-85c1-ee2e0847a38f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.459144 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7dfde443-6763-40e4-8465-3702c8dbf971-client-ca\") pod \"7dfde443-6763-40e4-8465-3702c8dbf971\" (UID: \"7dfde443-6763-40e4-8465-3702c8dbf971\") " Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.459238 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn6tt\" (UniqueName: \"kubernetes.io/projected/f9756733-abd0-4b98-85c1-ee2e0847a38f-kube-api-access-vn6tt\") pod \"f9756733-abd0-4b98-85c1-ee2e0847a38f\" (UID: \"f9756733-abd0-4b98-85c1-ee2e0847a38f\") " Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.459446 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-config" (OuterVolumeSpecName: "config") pod "f9756733-abd0-4b98-85c1-ee2e0847a38f" (UID: "f9756733-abd0-4b98-85c1-ee2e0847a38f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.459530 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dfde443-6763-40e4-8465-3702c8dbf971-client-ca" (OuterVolumeSpecName: "client-ca") pod "7dfde443-6763-40e4-8465-3702c8dbf971" (UID: "7dfde443-6763-40e4-8465-3702c8dbf971"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.459994 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8151ae66-f3ee-43f0-8557-c85bce7c86b2-serving-cert\") pod \"controller-manager-64996fbdb-rbwmc\" (UID: \"8151ae66-f3ee-43f0-8557-c85bce7c86b2\") " pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.460070 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8151ae66-f3ee-43f0-8557-c85bce7c86b2-config\") pod \"controller-manager-64996fbdb-rbwmc\" (UID: \"8151ae66-f3ee-43f0-8557-c85bce7c86b2\") " pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.460164 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8151ae66-f3ee-43f0-8557-c85bce7c86b2-proxy-ca-bundles\") pod \"controller-manager-64996fbdb-rbwmc\" (UID: \"8151ae66-f3ee-43f0-8557-c85bce7c86b2\") " pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.460217 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8151ae66-f3ee-43f0-8557-c85bce7c86b2-client-ca\") pod \"controller-manager-64996fbdb-rbwmc\" (UID: \"8151ae66-f3ee-43f0-8557-c85bce7c86b2\") " pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.460323 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v48l\" (UniqueName: \"kubernetes.io/projected/8151ae66-f3ee-43f0-8557-c85bce7c86b2-kube-api-access-4v48l\") pod \"controller-manager-64996fbdb-rbwmc\" (UID: \"8151ae66-f3ee-43f0-8557-c85bce7c86b2\") " pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.460436 4681 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.460470 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dfde443-6763-40e4-8465-3702c8dbf971-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.460489 4681 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.460508 4681 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7dfde443-6763-40e4-8465-3702c8dbf971-client-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.460526 4681 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9756733-abd0-4b98-85c1-ee2e0847a38f-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:08:17 crc 
kubenswrapper[4681]: I0122 09:08:17.463385 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9756733-abd0-4b98-85c1-ee2e0847a38f-kube-api-access-vn6tt" (OuterVolumeSpecName: "kube-api-access-vn6tt") pod "f9756733-abd0-4b98-85c1-ee2e0847a38f" (UID: "f9756733-abd0-4b98-85c1-ee2e0847a38f"). InnerVolumeSpecName "kube-api-access-vn6tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.465199 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9756733-abd0-4b98-85c1-ee2e0847a38f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f9756733-abd0-4b98-85c1-ee2e0847a38f" (UID: "f9756733-abd0-4b98-85c1-ee2e0847a38f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.465646 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dfde443-6763-40e4-8465-3702c8dbf971-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7dfde443-6763-40e4-8465-3702c8dbf971" (UID: "7dfde443-6763-40e4-8465-3702c8dbf971"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.472304 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dfde443-6763-40e4-8465-3702c8dbf971-kube-api-access-92n8c" (OuterVolumeSpecName: "kube-api-access-92n8c") pod "7dfde443-6763-40e4-8465-3702c8dbf971" (UID: "7dfde443-6763-40e4-8465-3702c8dbf971"). InnerVolumeSpecName "kube-api-access-92n8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.562302 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8151ae66-f3ee-43f0-8557-c85bce7c86b2-config\") pod \"controller-manager-64996fbdb-rbwmc\" (UID: \"8151ae66-f3ee-43f0-8557-c85bce7c86b2\") " pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.562417 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8151ae66-f3ee-43f0-8557-c85bce7c86b2-proxy-ca-bundles\") pod \"controller-manager-64996fbdb-rbwmc\" (UID: \"8151ae66-f3ee-43f0-8557-c85bce7c86b2\") " pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.562486 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8151ae66-f3ee-43f0-8557-c85bce7c86b2-client-ca\") pod \"controller-manager-64996fbdb-rbwmc\" (UID: \"8151ae66-f3ee-43f0-8557-c85bce7c86b2\") " pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.562601 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v48l\" (UniqueName: \"kubernetes.io/projected/8151ae66-f3ee-43f0-8557-c85bce7c86b2-kube-api-access-4v48l\") pod \"controller-manager-64996fbdb-rbwmc\" (UID: \"8151ae66-f3ee-43f0-8557-c85bce7c86b2\") " pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.562762 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8151ae66-f3ee-43f0-8557-c85bce7c86b2-serving-cert\") pod \"controller-manager-64996fbdb-rbwmc\" (UID: \"8151ae66-f3ee-43f0-8557-c85bce7c86b2\") " pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.562876 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9756733-abd0-4b98-85c1-ee2e0847a38f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.562921 4681 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dfde443-6763-40e4-8465-3702c8dbf971-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.562955 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn6tt\" (UniqueName: \"kubernetes.io/projected/f9756733-abd0-4b98-85c1-ee2e0847a38f-kube-api-access-vn6tt\") on node \"crc\" DevicePath \"\"" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.562996 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92n8c\" (UniqueName: \"kubernetes.io/projected/7dfde443-6763-40e4-8465-3702c8dbf971-kube-api-access-92n8c\") on node \"crc\" DevicePath \"\"" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.564115 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8151ae66-f3ee-43f0-8557-c85bce7c86b2-client-ca\") pod \"controller-manager-64996fbdb-rbwmc\" (UID: \"8151ae66-f3ee-43f0-8557-c85bce7c86b2\") " pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.564365 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8151ae66-f3ee-43f0-8557-c85bce7c86b2-config\") pod \"controller-manager-64996fbdb-rbwmc\" (UID: \"8151ae66-f3ee-43f0-8557-c85bce7c86b2\") " pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.566465 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8151ae66-f3ee-43f0-8557-c85bce7c86b2-proxy-ca-bundles\") pod \"controller-manager-64996fbdb-rbwmc\" (UID: \"8151ae66-f3ee-43f0-8557-c85bce7c86b2\") " pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.568791 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8151ae66-f3ee-43f0-8557-c85bce7c86b2-serving-cert\") pod \"controller-manager-64996fbdb-rbwmc\" (UID: \"8151ae66-f3ee-43f0-8557-c85bce7c86b2\") " pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.580984 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v48l\" (UniqueName: \"kubernetes.io/projected/8151ae66-f3ee-43f0-8557-c85bce7c86b2-kube-api-access-4v48l\") pod \"controller-manager-64996fbdb-rbwmc\" (UID: \"8151ae66-f3ee-43f0-8557-c85bce7c86b2\") " pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.619932 4681 generic.go:334] "Generic (PLEG): 
container finished" podID="7dfde443-6763-40e4-8465-3702c8dbf971" containerID="0dc21c3744fc7d701923d24775b1a452c9fadc3ffcdeeca15b8e4a9c1e75277d" exitCode=0 Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.620022 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" event={"ID":"7dfde443-6763-40e4-8465-3702c8dbf971","Type":"ContainerDied","Data":"0dc21c3744fc7d701923d24775b1a452c9fadc3ffcdeeca15b8e4a9c1e75277d"} Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.620072 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" event={"ID":"7dfde443-6763-40e4-8465-3702c8dbf971","Type":"ContainerDied","Data":"1aa3afc9518fbe7f4c7b38d1917a2b5be60a849ce5c8a7444373d08bd795b428"} Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.620112 4681 scope.go:117] "RemoveContainer" containerID="0dc21c3744fc7d701923d24775b1a452c9fadc3ffcdeeca15b8e4a9c1e75277d" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.619999 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.624240 4681 generic.go:334] "Generic (PLEG): container finished" podID="f9756733-abd0-4b98-85c1-ee2e0847a38f" containerID="1f0d04c77a7dd39c196c7704abaaccd31e295caa1a2e4d0a1cdb30a5a7323bd0" exitCode=0 Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.624296 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" event={"ID":"f9756733-abd0-4b98-85c1-ee2e0847a38f","Type":"ContainerDied","Data":"1f0d04c77a7dd39c196c7704abaaccd31e295caa1a2e4d0a1cdb30a5a7323bd0"} Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.625244 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" event={"ID":"f9756733-abd0-4b98-85c1-ee2e0847a38f","Type":"ContainerDied","Data":"a0f1fe1ce5bd9b860d0f43389e50340832a62ab611235b33dc99c36337a8c336"} Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.625311 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7665689847-mg9hr" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.627437 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.658192 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7665689847-mg9hr"] Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.667278 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7665689847-mg9hr"] Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.671768 4681 scope.go:117] "RemoveContainer" containerID="0dc21c3744fc7d701923d24775b1a452c9fadc3ffcdeeca15b8e4a9c1e75277d" Jan 22 09:08:17 crc kubenswrapper[4681]: E0122 09:08:17.672196 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dc21c3744fc7d701923d24775b1a452c9fadc3ffcdeeca15b8e4a9c1e75277d\": container with ID starting with 0dc21c3744fc7d701923d24775b1a452c9fadc3ffcdeeca15b8e4a9c1e75277d not found: ID does not exist" containerID="0dc21c3744fc7d701923d24775b1a452c9fadc3ffcdeeca15b8e4a9c1e75277d" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.672224 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dc21c3744fc7d701923d24775b1a452c9fadc3ffcdeeca15b8e4a9c1e75277d"} err="failed to get container status \"0dc21c3744fc7d701923d24775b1a452c9fadc3ffcdeeca15b8e4a9c1e75277d\": rpc error: code = NotFound desc = could not find container \"0dc21c3744fc7d701923d24775b1a452c9fadc3ffcdeeca15b8e4a9c1e75277d\": container with ID starting with 0dc21c3744fc7d701923d24775b1a452c9fadc3ffcdeeca15b8e4a9c1e75277d not found: ID does not exist" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.672243 4681 scope.go:117] "RemoveContainer" containerID="1f0d04c77a7dd39c196c7704abaaccd31e295caa1a2e4d0a1cdb30a5a7323bd0" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.673597 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn"] Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.679327 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6df6cd4c5-ft6xn"] Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.686342 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zz8lv" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.692113 4681 scope.go:117] "RemoveContainer" containerID="1f0d04c77a7dd39c196c7704abaaccd31e295caa1a2e4d0a1cdb30a5a7323bd0" Jan 22 09:08:17 crc kubenswrapper[4681]: E0122 09:08:17.692671 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f0d04c77a7dd39c196c7704abaaccd31e295caa1a2e4d0a1cdb30a5a7323bd0\": container with ID starting with 1f0d04c77a7dd39c196c7704abaaccd31e295caa1a2e4d0a1cdb30a5a7323bd0 not found: ID does not exist" containerID="1f0d04c77a7dd39c196c7704abaaccd31e295caa1a2e4d0a1cdb30a5a7323bd0" Jan 22 09:08:17 crc kubenswrapper[4681]: I0122 09:08:17.692711 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0d04c77a7dd39c196c7704abaaccd31e295caa1a2e4d0a1cdb30a5a7323bd0"} err="failed to get container status \"1f0d04c77a7dd39c196c7704abaaccd31e295caa1a2e4d0a1cdb30a5a7323bd0\": rpc error: code = NotFound desc = could not find container 
\"1f0d04c77a7dd39c196c7704abaaccd31e295caa1a2e4d0a1cdb30a5a7323bd0\": container with ID starting with 1f0d04c77a7dd39c196c7704abaaccd31e295caa1a2e4d0a1cdb30a5a7323bd0 not found: ID does not exist" Jan 22 09:08:18 crc kubenswrapper[4681]: I0122 09:08:18.075236 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64996fbdb-rbwmc"] Jan 22 09:08:18 crc kubenswrapper[4681]: W0122 09:08:18.090740 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8151ae66_f3ee_43f0_8557_c85bce7c86b2.slice/crio-a9be2919a658fb737172af86b7975a2ff940cde745b269504e7c48a60d0d02da WatchSource:0}: Error finding container a9be2919a658fb737172af86b7975a2ff940cde745b269504e7c48a60d0d02da: Status 404 returned error can't find the container with id a9be2919a658fb737172af86b7975a2ff940cde745b269504e7c48a60d0d02da Jan 22 09:08:18 crc kubenswrapper[4681]: I0122 09:08:18.633758 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" event={"ID":"8151ae66-f3ee-43f0-8557-c85bce7c86b2","Type":"ContainerStarted","Data":"a9be2919a658fb737172af86b7975a2ff940cde745b269504e7c48a60d0d02da"} Jan 22 09:08:19 crc kubenswrapper[4681]: I0122 09:08:19.465594 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dfde443-6763-40e4-8465-3702c8dbf971" path="/var/lib/kubelet/pods/7dfde443-6763-40e4-8465-3702c8dbf971/volumes" Jan 22 09:08:19 crc kubenswrapper[4681]: I0122 09:08:19.467311 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9756733-abd0-4b98-85c1-ee2e0847a38f" path="/var/lib/kubelet/pods/f9756733-abd0-4b98-85c1-ee2e0847a38f/volumes" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.153386 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp"] Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.154471 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.159070 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.159164 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.160364 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.161617 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.161960 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.164236 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.170699 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp"] Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.234190 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgvsq\" (UniqueName: \"kubernetes.io/projected/a126567d-1e3b-4409-a7fe-49f490c126e8-kube-api-access-mgvsq\") pod \"route-controller-manager-96df84589-tlsjp\" (UID: \"a126567d-1e3b-4409-a7fe-49f490c126e8\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.234695 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a126567d-1e3b-4409-a7fe-49f490c126e8-serving-cert\") pod \"route-controller-manager-96df84589-tlsjp\" (UID: \"a126567d-1e3b-4409-a7fe-49f490c126e8\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.234849 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a126567d-1e3b-4409-a7fe-49f490c126e8-config\") pod \"route-controller-manager-96df84589-tlsjp\" (UID: \"a126567d-1e3b-4409-a7fe-49f490c126e8\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.234955 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a126567d-1e3b-4409-a7fe-49f490c126e8-client-ca\") pod \"route-controller-manager-96df84589-tlsjp\" (UID: \"a126567d-1e3b-4409-a7fe-49f490c126e8\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.336762 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgvsq\" (UniqueName: \"kubernetes.io/projected/a126567d-1e3b-4409-a7fe-49f490c126e8-kube-api-access-mgvsq\") pod 
\"route-controller-manager-96df84589-tlsjp\" (UID: \"a126567d-1e3b-4409-a7fe-49f490c126e8\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.336805 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a126567d-1e3b-4409-a7fe-49f490c126e8-serving-cert\") pod \"route-controller-manager-96df84589-tlsjp\" (UID: \"a126567d-1e3b-4409-a7fe-49f490c126e8\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.336872 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a126567d-1e3b-4409-a7fe-49f490c126e8-config\") pod \"route-controller-manager-96df84589-tlsjp\" (UID: \"a126567d-1e3b-4409-a7fe-49f490c126e8\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.336901 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a126567d-1e3b-4409-a7fe-49f490c126e8-client-ca\") pod \"route-controller-manager-96df84589-tlsjp\" (UID: \"a126567d-1e3b-4409-a7fe-49f490c126e8\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.338147 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a126567d-1e3b-4409-a7fe-49f490c126e8-client-ca\") pod \"route-controller-manager-96df84589-tlsjp\" (UID: \"a126567d-1e3b-4409-a7fe-49f490c126e8\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.338864 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a126567d-1e3b-4409-a7fe-49f490c126e8-config\") pod \"route-controller-manager-96df84589-tlsjp\" (UID: \"a126567d-1e3b-4409-a7fe-49f490c126e8\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.345960 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a126567d-1e3b-4409-a7fe-49f490c126e8-serving-cert\") pod \"route-controller-manager-96df84589-tlsjp\" (UID: \"a126567d-1e3b-4409-a7fe-49f490c126e8\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.364146 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgvsq\" (UniqueName: \"kubernetes.io/projected/a126567d-1e3b-4409-a7fe-49f490c126e8-kube-api-access-mgvsq\") pod \"route-controller-manager-96df84589-tlsjp\" (UID: \"a126567d-1e3b-4409-a7fe-49f490c126e8\") " pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.543069 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.655225 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" event={"ID":"8151ae66-f3ee-43f0-8557-c85bce7c86b2","Type":"ContainerStarted","Data":"a5f67e1948d357671c6b215bcbb6ba789af5163ab23f7005e97376bd32a4ec73"} Jan 22 09:08:20 crc kubenswrapper[4681]: I0122 09:08:20.787302 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp"] Jan 22 09:08:20 crc kubenswrapper[4681]: W0122 09:08:20.794166 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda126567d_1e3b_4409_a7fe_49f490c126e8.slice/crio-c76f45e76b3fb966dd7f297ffb6c744bee39f9a76557d9f9ff4096445263ec32 WatchSource:0}: Error finding container c76f45e76b3fb966dd7f297ffb6c744bee39f9a76557d9f9ff4096445263ec32: Status 404 returned error can't find the container with id c76f45e76b3fb966dd7f297ffb6c744bee39f9a76557d9f9ff4096445263ec32 Jan 22 09:08:21 crc kubenswrapper[4681]: I0122 09:08:21.664578 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" event={"ID":"a126567d-1e3b-4409-a7fe-49f490c126e8","Type":"ContainerStarted","Data":"c76f45e76b3fb966dd7f297ffb6c744bee39f9a76557d9f9ff4096445263ec32"} Jan 22 09:08:22 crc kubenswrapper[4681]: I0122 09:08:22.671664 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" event={"ID":"a126567d-1e3b-4409-a7fe-49f490c126e8","Type":"ContainerStarted","Data":"00665a4e04e6d2ba0d282d63ebf7445d4753effd6ab87b24cffa37c0424b51a4"} Jan 22 09:08:22 crc kubenswrapper[4681]: I0122 09:08:22.672100 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:22 crc kubenswrapper[4681]: I0122 09:08:22.672119 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" Jan 22 09:08:22 crc kubenswrapper[4681]: I0122 09:08:22.676654 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" Jan 22 09:08:22 crc kubenswrapper[4681]: I0122 09:08:22.691632 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" podStartSLOduration=9.691620779 podStartE2EDuration="9.691620779s" podCreationTimestamp="2026-01-22 09:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:08:22.690408367 +0000 UTC m=+293.516318882" watchObservedRunningTime="2026-01-22 09:08:22.691620779 +0000 UTC m=+293.517531284" Jan 22 09:08:22 crc kubenswrapper[4681]: I0122 09:08:22.709629 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64996fbdb-rbwmc" podStartSLOduration=9.709610017 podStartE2EDuration="9.709610017s" podCreationTimestamp="2026-01-22 09:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-22 09:08:22.70747137 +0000 UTC m=+293.533381875" watchObservedRunningTime="2026-01-22 09:08:22.709610017 +0000 UTC m=+293.535520522" Jan 22 09:08:22 crc kubenswrapper[4681]: I0122 09:08:22.750837 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-96df84589-tlsjp" Jan 22 09:08:24 crc kubenswrapper[4681]: I0122 09:08:24.485121 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8lt9z" Jan 22 09:08:24 crc kubenswrapper[4681]: I0122 09:08:24.562405 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8lt9z" Jan 22 09:08:29 crc kubenswrapper[4681]: I0122 09:08:29.320800 4681 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 22 09:08:38 crc kubenswrapper[4681]: I0122 09:08:38.474102 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-c287z" podUID="49778c45-8be5-4610-8298-01e06333289c" containerName="registry" containerID="cri-o://b791dfc6198d96c5df5454c0919e251d4300ad70d6223522af5e088182c3ceb0" gracePeriod=30 Jan 22 09:08:38 crc kubenswrapper[4681]: I0122 09:08:38.765286 4681 generic.go:334] "Generic (PLEG): container finished" podID="49778c45-8be5-4610-8298-01e06333289c" containerID="b791dfc6198d96c5df5454c0919e251d4300ad70d6223522af5e088182c3ceb0" exitCode=0 Jan 22 09:08:38 crc kubenswrapper[4681]: I0122 09:08:38.765302 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-c287z" event={"ID":"49778c45-8be5-4610-8298-01e06333289c","Type":"ContainerDied","Data":"b791dfc6198d96c5df5454c0919e251d4300ad70d6223522af5e088182c3ceb0"} Jan 22 09:08:38 crc kubenswrapper[4681]: I0122 09:08:38.912709 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.102308 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-bound-sa-token\") pod \"49778c45-8be5-4610-8298-01e06333289c\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.102703 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/49778c45-8be5-4610-8298-01e06333289c-registry-certificates\") pod \"49778c45-8be5-4610-8298-01e06333289c\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.102750 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5dr4\" (UniqueName: \"kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-kube-api-access-c5dr4\") pod \"49778c45-8be5-4610-8298-01e06333289c\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.102776 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-registry-tls\") pod \"49778c45-8be5-4610-8298-01e06333289c\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.102967 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"49778c45-8be5-4610-8298-01e06333289c\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.103024 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/49778c45-8be5-4610-8298-01e06333289c-installation-pull-secrets\") pod \"49778c45-8be5-4610-8298-01e06333289c\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.103050 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49778c45-8be5-4610-8298-01e06333289c-trusted-ca\") pod \"49778c45-8be5-4610-8298-01e06333289c\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.103082 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/49778c45-8be5-4610-8298-01e06333289c-ca-trust-extracted\") pod \"49778c45-8be5-4610-8298-01e06333289c\" (UID: \"49778c45-8be5-4610-8298-01e06333289c\") " Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.103548 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49778c45-8be5-4610-8298-01e06333289c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "49778c45-8be5-4610-8298-01e06333289c" (UID: "49778c45-8be5-4610-8298-01e06333289c"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.104208 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49778c45-8be5-4610-8298-01e06333289c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "49778c45-8be5-4610-8298-01e06333289c" (UID: "49778c45-8be5-4610-8298-01e06333289c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.105788 4681 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49778c45-8be5-4610-8298-01e06333289c-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.105828 4681 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/49778c45-8be5-4610-8298-01e06333289c-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.111291 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49778c45-8be5-4610-8298-01e06333289c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "49778c45-8be5-4610-8298-01e06333289c" (UID: "49778c45-8be5-4610-8298-01e06333289c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.111445 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "49778c45-8be5-4610-8298-01e06333289c" (UID: "49778c45-8be5-4610-8298-01e06333289c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.118936 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49778c45-8be5-4610-8298-01e06333289c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "49778c45-8be5-4610-8298-01e06333289c" (UID: "49778c45-8be5-4610-8298-01e06333289c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.119451 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-kube-api-access-c5dr4" (OuterVolumeSpecName: "kube-api-access-c5dr4") pod "49778c45-8be5-4610-8298-01e06333289c" (UID: "49778c45-8be5-4610-8298-01e06333289c"). InnerVolumeSpecName "kube-api-access-c5dr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.119655 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "49778c45-8be5-4610-8298-01e06333289c" (UID: "49778c45-8be5-4610-8298-01e06333289c"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.120216 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "49778c45-8be5-4610-8298-01e06333289c" (UID: "49778c45-8be5-4610-8298-01e06333289c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.207522 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5dr4\" (UniqueName: \"kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-kube-api-access-c5dr4\") on node \"crc\" DevicePath \"\"" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.207784 4681 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.208057 4681 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/49778c45-8be5-4610-8298-01e06333289c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.208162 4681 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/49778c45-8be5-4610-8298-01e06333289c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.208329 4681 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/49778c45-8be5-4610-8298-01e06333289c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.775704 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-c287z" event={"ID":"49778c45-8be5-4610-8298-01e06333289c","Type":"ContainerDied","Data":"ad285833ebead20ecc1fec6dea10f3d83342ccda240bd998f750dabcaa6f42c7"} Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.775882 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-c287z" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.776406 4681 scope.go:117] "RemoveContainer" containerID="b791dfc6198d96c5df5454c0919e251d4300ad70d6223522af5e088182c3ceb0" Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.805611 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-c287z"] Jan 22 09:08:39 crc kubenswrapper[4681]: I0122 09:08:39.812413 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-c287z"] Jan 22 09:08:41 crc kubenswrapper[4681]: I0122 09:08:41.463777 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49778c45-8be5-4610-8298-01e06333289c" path="/var/lib/kubelet/pods/49778c45-8be5-4610-8298-01e06333289c/volumes" Jan 22 09:08:56 crc kubenswrapper[4681]: I0122 09:08:56.031055 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:08:56 crc kubenswrapper[4681]: I0122 09:08:56.031680 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:09:26 crc kubenswrapper[4681]: I0122 09:09:26.031531 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:09:26 crc kubenswrapper[4681]: I0122 09:09:26.032357 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:09:56 crc kubenswrapper[4681]: I0122 09:09:56.031806 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:09:56 crc kubenswrapper[4681]: I0122 09:09:56.032352 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:09:56 crc kubenswrapper[4681]: I0122 09:09:56.032397 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:09:56 crc kubenswrapper[4681]: I0122 09:09:56.032902 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"b4ec2fa3e3aaacfce78c09a1f6c50b2addaa48c1eb65926acc4c02cf4a2b90d9"} pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 09:09:56 crc kubenswrapper[4681]: I0122 09:09:56.032965 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" containerID="cri-o://b4ec2fa3e3aaacfce78c09a1f6c50b2addaa48c1eb65926acc4c02cf4a2b90d9" gracePeriod=600 Jan 22 09:09:56 crc kubenswrapper[4681]: I0122 09:09:56.248910 4681 generic.go:334] "Generic (PLEG): container finished" podID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerID="b4ec2fa3e3aaacfce78c09a1f6c50b2addaa48c1eb65926acc4c02cf4a2b90d9" exitCode=0 Jan 22 09:09:56 crc kubenswrapper[4681]: I0122 09:09:56.248968 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerDied","Data":"b4ec2fa3e3aaacfce78c09a1f6c50b2addaa48c1eb65926acc4c02cf4a2b90d9"} Jan 22 09:09:56 crc kubenswrapper[4681]: I0122 09:09:56.249177 4681 scope.go:117] "RemoveContainer" containerID="dd545a538ee0d4885a7c0851ab88c778c7ae321e7dec7c6afb193d36bbeb4815" Jan 22 09:09:57 crc kubenswrapper[4681]: I0122 09:09:57.257853 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerStarted","Data":"39698b3d5d144b43917440c8cecc471264f2f7dcffc44d6bfe898d27e9d76dce"} Jan 22 09:10:29 crc kubenswrapper[4681]: I0122 09:10:29.843061 4681 scope.go:117] "RemoveContainer" containerID="10133ec666c98534bf787f2a612f74b75a7e3d49b6de68349e6907ac608160a9" Jan 22 09:10:29 crc kubenswrapper[4681]: I0122 09:10:29.865766 4681 scope.go:117] "RemoveContainer" containerID="797a8489ac4ea3f366b131ea65add3d23402e0f88662eb6b7a2a9c5d925a4124" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.444943 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-28zgq"] Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.446040 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="ovn-controller" containerID="cri-o://6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f" gracePeriod=30 Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.446098 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="nbdb" containerID="cri-o://fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634" gracePeriod=30 Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.446243 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="sbdb" containerID="cri-o://33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474" gracePeriod=30 Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.446283 4681 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="northd" containerID="cri-o://e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786" gracePeriod=30 Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.446383 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="ovn-acl-logging" containerID="cri-o://7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e" gracePeriod=30 Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.446399 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc" gracePeriod=30 Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.446456 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="kube-rbac-proxy-node" containerID="cri-o://0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a" gracePeriod=30 Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.481846 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="ovnkube-controller" containerID="cri-o://e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c" gracePeriod=30 Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.764204 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-28zgq_3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0/ovn-acl-logging/0.log" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.765491 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-28zgq_3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0/ovn-controller/0.log" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.766528 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.788767 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-kubelet\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.788838 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-slash\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.788894 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-node-log\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.788921 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.788948 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-systemd\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.788990 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-slash" (OuterVolumeSpecName: "host-slash") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.788998 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-env-overrides\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789023 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-node-log" (OuterVolumeSpecName: "node-log") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789052 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovnkube-config\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789094 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-cni-netd\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789139 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-ovn\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789192 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-log-socket\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789236 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-systemd-units\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789317 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovn-node-metrics-cert\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789361 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-openvswitch\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789410 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jfkg\" (UniqueName: \"kubernetes.io/projected/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-kube-api-access-8jfkg\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789460 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-var-lib-openvswitch\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789505 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovnkube-script-lib\") pod 
\"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789512 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-log-socket" (OuterVolumeSpecName: "log-socket") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789544 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-etc-openvswitch\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789580 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789594 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-run-ovn-kubernetes\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789621 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789639 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-cni-bin\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789689 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.789736 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-run-netns\") pod \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\" (UID: \"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0\") " Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.790050 4681 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.790083 4681 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-slash\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.790137 4681 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-node-log\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.790161 4681 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.790185 4681 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.790206 4681 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-log-socket\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.790037 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.790142 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.790235 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.790315 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.790342 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.790399 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.790443 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.790483 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.790519 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.790987 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.791431 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.796211 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-kube-api-access-8jfkg" (OuterVolumeSpecName: "kube-api-access-8jfkg") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "kube-api-access-8jfkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.796649 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.825189 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" (UID: "3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.833991 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jl5g2"] Jan 22 09:11:26 crc kubenswrapper[4681]: E0122 09:11:26.834402 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="kube-rbac-proxy-ovn-metrics" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.834437 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="kube-rbac-proxy-ovn-metrics" Jan 22 09:11:26 crc kubenswrapper[4681]: E0122 09:11:26.834457 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="northd" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.834470 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="northd" Jan 22 09:11:26 crc kubenswrapper[4681]: E0122 09:11:26.834496 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="kube-rbac-proxy-node" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.834509 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="kube-rbac-proxy-node" Jan 22 09:11:26 crc kubenswrapper[4681]: E0122 09:11:26.834530 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="sbdb" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.834541 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="sbdb" Jan 22 09:11:26 crc kubenswrapper[4681]: E0122 09:11:26.834563 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49778c45-8be5-4610-8298-01e06333289c" containerName="registry" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.834576 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="49778c45-8be5-4610-8298-01e06333289c" containerName="registry" Jan 22 09:11:26 crc kubenswrapper[4681]: E0122 09:11:26.834593 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="ovnkube-controller" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.834605 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="ovnkube-controller" Jan 22 09:11:26 crc kubenswrapper[4681]: E0122 09:11:26.834623 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="ovn-acl-logging" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.834635 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="ovn-acl-logging" Jan 22 09:11:26 crc kubenswrapper[4681]: E0122 09:11:26.834655 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="kubecfg-setup" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.834667 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="kubecfg-setup" Jan 22 09:11:26 crc kubenswrapper[4681]: E0122 09:11:26.834683 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" 
containerName="ovn-controller" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.834695 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="ovn-controller" Jan 22 09:11:26 crc kubenswrapper[4681]: E0122 09:11:26.834710 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="nbdb" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.834722 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="nbdb" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.834887 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="sbdb" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.834907 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="nbdb" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.834923 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="kube-rbac-proxy-ovn-metrics" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.834942 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="49778c45-8be5-4610-8298-01e06333289c" containerName="registry" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.834959 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="ovnkube-controller" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.834978 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="kube-rbac-proxy-node" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.834995 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="ovn-controller" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.835016 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="ovn-acl-logging" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.835033 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerName="northd" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.839672 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.859381 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-28zgq_3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0/ovn-acl-logging/0.log" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.859878 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-28zgq_3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0/ovn-controller/0.log" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860456 4681 generic.go:334] "Generic (PLEG): container finished" podID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerID="e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c" exitCode=0 Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860478 4681 generic.go:334] "Generic (PLEG): container finished" podID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerID="33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474" exitCode=0 Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860486 4681 generic.go:334] "Generic (PLEG): container finished" podID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerID="fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634" exitCode=0 Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860493 4681 generic.go:334] "Generic (PLEG): container finished" podID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerID="e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786" exitCode=0 Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860499 4681 generic.go:334] "Generic (PLEG): container finished" podID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerID="4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc" exitCode=0 Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860505 4681 generic.go:334] "Generic (PLEG): container finished" podID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerID="0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a" exitCode=0 Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860511 4681 generic.go:334] "Generic (PLEG): container finished" podID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerID="7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e" exitCode=143 Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860516 4681 generic.go:334] "Generic (PLEG): container finished" podID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" containerID="6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f" exitCode=143 Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860552 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerDied","Data":"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860589 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerDied","Data":"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860601 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerDied","Data":"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634"} Jan 22 09:11:26 
crc kubenswrapper[4681]: I0122 09:11:26.860610 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerDied","Data":"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860620 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerDied","Data":"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860629 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerDied","Data":"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860639 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860650 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860655 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860662 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerDied","Data":"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860669 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860675 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860680 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860685 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860690 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860695 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 
09:11:26.860700 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860705 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860710 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860717 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerDied","Data":"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860724 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860731 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860736 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860741 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860746 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860751 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860757 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860762 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860767 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860775 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" 
event={"ID":"3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0","Type":"ContainerDied","Data":"afa77250c4f77835910ead4aa39d6d70c11f13582f6bfffbbd60b36b29e79354"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860782 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860787 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860793 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860798 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860803 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860808 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860813 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860819 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860824 4681 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860837 4681 scope.go:117] "RemoveContainer" containerID="e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.860962 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-28zgq" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.865232 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xpdjl_1976858f-1664-4b36-9929-65cc8fe9d0ad/kube-multus/0.log" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.865448 4681 generic.go:334] "Generic (PLEG): container finished" podID="1976858f-1664-4b36-9929-65cc8fe9d0ad" containerID="c5d3f9a31740c41885595ea6d65e085332d9c898f6c5f2080131d3840cfc5c51" exitCode=2 Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.865522 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xpdjl" event={"ID":"1976858f-1664-4b36-9929-65cc8fe9d0ad","Type":"ContainerDied","Data":"c5d3f9a31740c41885595ea6d65e085332d9c898f6c5f2080131d3840cfc5c51"} Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.866188 4681 scope.go:117] "RemoveContainer" containerID="c5d3f9a31740c41885595ea6d65e085332d9c898f6c5f2080131d3840cfc5c51" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.891745 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-slash\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.891835 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-run-openvswitch\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.891892 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-log-socket\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.891916 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-systemd-units\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.891939 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7652cec3-8eaa-46df-a209-7babb32e5ec3-ovnkube-config\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.892012 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-var-lib-openvswitch\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.892068 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-node-log\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.892097 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlckf\" (UniqueName: \"kubernetes.io/projected/7652cec3-8eaa-46df-a209-7babb32e5ec3-kube-api-access-zlckf\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.892619 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7652cec3-8eaa-46df-a209-7babb32e5ec3-ovnkube-script-lib\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.892655 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-run-netns\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.892675 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-run-systemd\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.892732 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-run-ovn\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.892755 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7652cec3-8eaa-46df-a209-7babb32e5ec3-ovn-node-metrics-cert\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.892792 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-cni-netd\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.892816 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.892839 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7652cec3-8eaa-46df-a209-7babb32e5ec3-env-overrides\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.892864 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-cni-bin\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.892887 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-kubelet\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.893386 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-run-ovn-kubernetes\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.893420 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-etc-openvswitch\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.893779 4681 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.893869 4681 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.893882 4681 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.893892 4681 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.893904 4681 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.893914 4681 reconciler_common.go:293] "Volume 
detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.893925 4681 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.893937 4681 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.893946 4681 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.893955 4681 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.893965 4681 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.893975 4681 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.893983 4681 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.893992 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jfkg\" (UniqueName: \"kubernetes.io/projected/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0-kube-api-access-8jfkg\") on node \"crc\" DevicePath \"\"" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.917292 4681 scope.go:117] "RemoveContainer" containerID="33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.933175 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-28zgq"] Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.937880 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-28zgq"] Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.940181 4681 scope.go:117] "RemoveContainer" containerID="fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.965587 4681 scope.go:117] "RemoveContainer" containerID="e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.990417 4681 scope.go:117] "RemoveContainer" containerID="4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995315 4681 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-cni-netd\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995371 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995397 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7652cec3-8eaa-46df-a209-7babb32e5ec3-env-overrides\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995417 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-cni-bin\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995444 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-kubelet\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995464 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-etc-openvswitch\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995487 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-run-ovn-kubernetes\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995521 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-slash\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995554 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-run-openvswitch\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995589 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-log-socket\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995610 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-systemd-units\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995638 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7652cec3-8eaa-46df-a209-7babb32e5ec3-ovnkube-config\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995671 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-var-lib-openvswitch\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995707 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-node-log\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995727 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlckf\" (UniqueName: \"kubernetes.io/projected/7652cec3-8eaa-46df-a209-7babb32e5ec3-kube-api-access-zlckf\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995750 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7652cec3-8eaa-46df-a209-7babb32e5ec3-ovnkube-script-lib\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995773 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-run-netns\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995794 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-run-systemd\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995834 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-run-ovn\") pod 
\"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.995858 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7652cec3-8eaa-46df-a209-7babb32e5ec3-ovn-node-metrics-cert\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.996299 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-log-socket\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.996358 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-cni-netd\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.996389 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.996416 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-run-ovn-kubernetes\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.996433 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-etc-openvswitch\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.996443 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-slash\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.996438 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-run-netns\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.996477 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-run-openvswitch\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.996485 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-cni-bin\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.996542 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-host-kubelet\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.996566 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-run-systemd\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.996576 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-var-lib-openvswitch\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.996600 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-node-log\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.996610 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-run-ovn\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.996663 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7652cec3-8eaa-46df-a209-7babb32e5ec3-systemd-units\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.996979 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7652cec3-8eaa-46df-a209-7babb32e5ec3-env-overrides\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.996988 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7652cec3-8eaa-46df-a209-7babb32e5ec3-ovnkube-script-lib\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.998053 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7652cec3-8eaa-46df-a209-7babb32e5ec3-ovnkube-config\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:26 crc kubenswrapper[4681]: I0122 09:11:26.999475 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7652cec3-8eaa-46df-a209-7babb32e5ec3-ovn-node-metrics-cert\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.020114 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlckf\" (UniqueName: \"kubernetes.io/projected/7652cec3-8eaa-46df-a209-7babb32e5ec3-kube-api-access-zlckf\") pod \"ovnkube-node-jl5g2\" (UID: \"7652cec3-8eaa-46df-a209-7babb32e5ec3\") " pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.028362 4681 scope.go:117] "RemoveContainer" containerID="0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.042489 4681 scope.go:117] "RemoveContainer" containerID="7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.059553 4681 scope.go:117] "RemoveContainer" containerID="6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.079771 4681 scope.go:117] "RemoveContainer" containerID="4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.097070 4681 scope.go:117] "RemoveContainer" containerID="e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c" Jan 22 09:11:27 crc kubenswrapper[4681]: E0122 09:11:27.098149 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c\": container with ID starting with e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c not found: ID does not exist" containerID="e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.098190 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c"} err="failed to get container status \"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c\": rpc error: code = NotFound desc = could not find container \"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c\": container with ID starting with e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.098218 4681 scope.go:117] "RemoveContainer" containerID="33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474" Jan 22 09:11:27 crc kubenswrapper[4681]: E0122 09:11:27.098536 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474\": container with ID starting with 33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474 not found: ID does not exist" 
containerID="33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.098564 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474"} err="failed to get container status \"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474\": rpc error: code = NotFound desc = could not find container \"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474\": container with ID starting with 33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.098579 4681 scope.go:117] "RemoveContainer" containerID="fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634" Jan 22 09:11:27 crc kubenswrapper[4681]: E0122 09:11:27.098832 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634\": container with ID starting with fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634 not found: ID does not exist" containerID="fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.098913 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634"} err="failed to get container status \"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634\": rpc error: code = NotFound desc = could not find container \"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634\": container with ID starting with fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.098932 4681 scope.go:117] "RemoveContainer" containerID="e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786" Jan 22 09:11:27 crc kubenswrapper[4681]: E0122 09:11:27.099205 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786\": container with ID starting with e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786 not found: ID does not exist" containerID="e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.099225 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786"} err="failed to get container status \"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786\": rpc error: code = NotFound desc = could not find container \"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786\": container with ID starting with e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.099241 4681 scope.go:117] "RemoveContainer" containerID="4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc" Jan 22 09:11:27 crc kubenswrapper[4681]: E0122 09:11:27.099508 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc\": container with ID starting with 4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc not found: ID does not exist" containerID="4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.099526 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc"} err="failed to get container status \"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc\": rpc error: code = NotFound desc = could not find container \"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc\": container with ID starting with 4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.099542 4681 scope.go:117] "RemoveContainer" containerID="0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a" Jan 22 09:11:27 crc kubenswrapper[4681]: E0122 09:11:27.099758 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a\": container with ID starting with 0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a not found: ID does not exist" containerID="0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.099783 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a"} err="failed to get container status \"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a\": rpc error: code = NotFound desc = could not find container \"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a\": container with ID starting with 0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.099803 4681 scope.go:117] "RemoveContainer" containerID="7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e" Jan 22 09:11:27 crc kubenswrapper[4681]: E0122 09:11:27.100117 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e\": container with ID starting with 7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e not found: ID does not exist" containerID="7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.100146 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e"} err="failed to get container status \"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e\": rpc error: code = NotFound desc = could not find container \"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e\": container with ID starting with 7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.100165 4681 scope.go:117] "RemoveContainer" containerID="6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f" Jan 22 09:11:27 crc 
kubenswrapper[4681]: E0122 09:11:27.101012 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f\": container with ID starting with 6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f not found: ID does not exist" containerID="6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.101033 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f"} err="failed to get container status \"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f\": rpc error: code = NotFound desc = could not find container \"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f\": container with ID starting with 6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.101064 4681 scope.go:117] "RemoveContainer" containerID="4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420" Jan 22 09:11:27 crc kubenswrapper[4681]: E0122 09:11:27.101382 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420\": container with ID starting with 4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420 not found: ID does not exist" containerID="4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.101403 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420"} err="failed to get container status \"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420\": rpc error: code = NotFound desc = could not find container \"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420\": container with ID starting with 4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.101420 4681 scope.go:117] "RemoveContainer" containerID="e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.101631 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c"} err="failed to get container status \"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c\": rpc error: code = NotFound desc = could not find container \"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c\": container with ID starting with e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.101654 4681 scope.go:117] "RemoveContainer" containerID="33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.102601 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474"} err="failed to get container status 
\"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474\": rpc error: code = NotFound desc = could not find container \"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474\": container with ID starting with 33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.102625 4681 scope.go:117] "RemoveContainer" containerID="fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.102846 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634"} err="failed to get container status \"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634\": rpc error: code = NotFound desc = could not find container \"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634\": container with ID starting with fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.102863 4681 scope.go:117] "RemoveContainer" containerID="e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.103073 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786"} err="failed to get container status \"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786\": rpc error: code = NotFound desc = could not find container \"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786\": container with ID starting with e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.103096 4681 scope.go:117] "RemoveContainer" containerID="4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.103497 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc"} err="failed to get container status \"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc\": rpc error: code = NotFound desc = could not find container \"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc\": container with ID starting with 4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.103522 4681 scope.go:117] "RemoveContainer" containerID="0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.103851 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a"} err="failed to get container status \"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a\": rpc error: code = NotFound desc = could not find container \"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a\": container with ID starting with 0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.103880 4681 scope.go:117] "RemoveContainer" 
containerID="7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.104122 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e"} err="failed to get container status \"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e\": rpc error: code = NotFound desc = could not find container \"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e\": container with ID starting with 7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.104142 4681 scope.go:117] "RemoveContainer" containerID="6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.104780 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f"} err="failed to get container status \"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f\": rpc error: code = NotFound desc = could not find container \"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f\": container with ID starting with 6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.104807 4681 scope.go:117] "RemoveContainer" containerID="4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.105087 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420"} err="failed to get container status \"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420\": rpc error: code = NotFound desc = could not find container \"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420\": container with ID starting with 4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.105109 4681 scope.go:117] "RemoveContainer" containerID="e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.106139 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c"} err="failed to get container status \"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c\": rpc error: code = NotFound desc = could not find container \"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c\": container with ID starting with e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.106177 4681 scope.go:117] "RemoveContainer" containerID="33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.106480 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474"} err="failed to get container status \"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474\": rpc error: code = NotFound desc = could not find 
container \"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474\": container with ID starting with 33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.106504 4681 scope.go:117] "RemoveContainer" containerID="fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.106987 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634"} err="failed to get container status \"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634\": rpc error: code = NotFound desc = could not find container \"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634\": container with ID starting with fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.107012 4681 scope.go:117] "RemoveContainer" containerID="e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.107316 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786"} err="failed to get container status \"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786\": rpc error: code = NotFound desc = could not find container \"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786\": container with ID starting with e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.107339 4681 scope.go:117] "RemoveContainer" containerID="4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.107787 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc"} err="failed to get container status \"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc\": rpc error: code = NotFound desc = could not find container \"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc\": container with ID starting with 4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.107874 4681 scope.go:117] "RemoveContainer" containerID="0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.108475 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a"} err="failed to get container status \"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a\": rpc error: code = NotFound desc = could not find container \"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a\": container with ID starting with 0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.108516 4681 scope.go:117] "RemoveContainer" containerID="7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.108915 4681 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e"} err="failed to get container status \"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e\": rpc error: code = NotFound desc = could not find container \"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e\": container with ID starting with 7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.108951 4681 scope.go:117] "RemoveContainer" containerID="6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.109236 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f"} err="failed to get container status \"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f\": rpc error: code = NotFound desc = could not find container \"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f\": container with ID starting with 6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.109279 4681 scope.go:117] "RemoveContainer" containerID="4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.109499 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420"} err="failed to get container status \"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420\": rpc error: code = NotFound desc = could not find container \"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420\": container with ID starting with 4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.109523 4681 scope.go:117] "RemoveContainer" containerID="e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.109763 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c"} err="failed to get container status \"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c\": rpc error: code = NotFound desc = could not find container \"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c\": container with ID starting with e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.109787 4681 scope.go:117] "RemoveContainer" containerID="33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.110057 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474"} err="failed to get container status \"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474\": rpc error: code = NotFound desc = could not find container \"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474\": container with ID starting with 
33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.110087 4681 scope.go:117] "RemoveContainer" containerID="fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.110557 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634"} err="failed to get container status \"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634\": rpc error: code = NotFound desc = could not find container \"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634\": container with ID starting with fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.110612 4681 scope.go:117] "RemoveContainer" containerID="e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.110960 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786"} err="failed to get container status \"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786\": rpc error: code = NotFound desc = could not find container \"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786\": container with ID starting with e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.111001 4681 scope.go:117] "RemoveContainer" containerID="4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.111448 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc"} err="failed to get container status \"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc\": rpc error: code = NotFound desc = could not find container \"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc\": container with ID starting with 4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.111495 4681 scope.go:117] "RemoveContainer" containerID="0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.112844 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a"} err="failed to get container status \"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a\": rpc error: code = NotFound desc = could not find container \"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a\": container with ID starting with 0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.112911 4681 scope.go:117] "RemoveContainer" containerID="7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.113682 4681 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e"} err="failed to get container status \"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e\": rpc error: code = NotFound desc = could not find container \"7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e\": container with ID starting with 7520930e04b18c1fd48fb6a0e1f03c785c466bdb050bbf30b5cba788e39ed54e not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.113811 4681 scope.go:117] "RemoveContainer" containerID="6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.114142 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f"} err="failed to get container status \"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f\": rpc error: code = NotFound desc = could not find container \"6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f\": container with ID starting with 6da11c414af36e26ad3b43d970f4b653d80312134136d92cf805c70e0a69ca4f not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.114182 4681 scope.go:117] "RemoveContainer" containerID="4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.114941 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420"} err="failed to get container status \"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420\": rpc error: code = NotFound desc = could not find container \"4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420\": container with ID starting with 4dd5ac229ef2b9fc0ff776c87e33ae14542dcad6962d09acfb4926dd2391d420 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.114968 4681 scope.go:117] "RemoveContainer" containerID="e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.116395 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c"} err="failed to get container status \"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c\": rpc error: code = NotFound desc = could not find container \"e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c\": container with ID starting with e36597fbd5efb466e57cefb2ca76927ddf067ce74411bddea5be4bd1624f476c not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.116468 4681 scope.go:117] "RemoveContainer" containerID="33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.117791 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474"} err="failed to get container status \"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474\": rpc error: code = NotFound desc = could not find container \"33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474\": container with ID starting with 33e44177c5a1780fae0fc38c014470ed400e94595627419ffbe736d9e9733474 not found: ID does not exist" Jan 
22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.117856 4681 scope.go:117] "RemoveContainer" containerID="fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.118326 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634"} err="failed to get container status \"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634\": rpc error: code = NotFound desc = could not find container \"fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634\": container with ID starting with fb2a0803ed9a752f19a81e6a322a322da3ac453a7dd04bc68183ecfcc0eb9634 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.118355 4681 scope.go:117] "RemoveContainer" containerID="e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.118952 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786"} err="failed to get container status \"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786\": rpc error: code = NotFound desc = could not find container \"e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786\": container with ID starting with e7f42fd2a31ccf799f0a8b7125466e55e31ae8855bdf1908c63d41109a3cf786 not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.118995 4681 scope.go:117] "RemoveContainer" containerID="4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.119568 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc"} err="failed to get container status \"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc\": rpc error: code = NotFound desc = could not find container \"4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc\": container with ID starting with 4d52f1de932b399a4e87ce0cde63b76800a44a1fa91e81155daea0a71c63bccc not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.119596 4681 scope.go:117] "RemoveContainer" containerID="0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.119965 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a"} err="failed to get container status \"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a\": rpc error: code = NotFound desc = could not find container \"0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a\": container with ID starting with 0adc26eba13fc6398fd429678462f0426fd54593eada2a828cc2d63c7faef36a not found: ID does not exist" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.170708 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.465999 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0" path="/var/lib/kubelet/pods/3e5bdf3a-d05e-4ef5-ada6-301e7e7e49c0/volumes" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.874082 4681 generic.go:334] "Generic (PLEG): container finished" podID="7652cec3-8eaa-46df-a209-7babb32e5ec3" containerID="e99a260e85a3a06b823325241040fb7c7b216a032b385e99c0f50f8c33f0fb12" exitCode=0 Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.874174 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" event={"ID":"7652cec3-8eaa-46df-a209-7babb32e5ec3","Type":"ContainerDied","Data":"e99a260e85a3a06b823325241040fb7c7b216a032b385e99c0f50f8c33f0fb12"} Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.874223 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" event={"ID":"7652cec3-8eaa-46df-a209-7babb32e5ec3","Type":"ContainerStarted","Data":"aa4f11a84b29f31e635dc0b2572e2ef96fa0a4f02cfb7d190a7065c48e1abd8a"} Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.880289 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xpdjl_1976858f-1664-4b36-9929-65cc8fe9d0ad/kube-multus/0.log" Jan 22 09:11:27 crc kubenswrapper[4681]: I0122 09:11:27.880674 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xpdjl" event={"ID":"1976858f-1664-4b36-9929-65cc8fe9d0ad","Type":"ContainerStarted","Data":"9396881dcdcd265c9082693b8c71ff3c4b5e8eff0945d46302936749543609d3"} Jan 22 09:11:28 crc kubenswrapper[4681]: I0122 09:11:28.888679 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" event={"ID":"7652cec3-8eaa-46df-a209-7babb32e5ec3","Type":"ContainerStarted","Data":"4a643f773694e969ed21ada40b6eafd44ed6cb60fc84c1009ec0b549008df011"} Jan 22 09:11:28 crc kubenswrapper[4681]: I0122 09:11:28.888727 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" event={"ID":"7652cec3-8eaa-46df-a209-7babb32e5ec3","Type":"ContainerStarted","Data":"1992d7cd4fd876a43d7108fbe5899e5690d4c5843c692b2d0672d5539b9ec1c8"} Jan 22 09:11:28 crc kubenswrapper[4681]: I0122 09:11:28.888742 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" event={"ID":"7652cec3-8eaa-46df-a209-7babb32e5ec3","Type":"ContainerStarted","Data":"4eea37a969f9e8f4dd35b74e26e0ac04e094c53e06a14ed94f4e1c61eb44c412"} Jan 22 09:11:28 crc kubenswrapper[4681]: I0122 09:11:28.888754 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" event={"ID":"7652cec3-8eaa-46df-a209-7babb32e5ec3","Type":"ContainerStarted","Data":"cab8676dbd4ccf940a502b217b89af43e1e0cb43f5b9e0534dad36da01d95d50"} Jan 22 09:11:28 crc kubenswrapper[4681]: I0122 09:11:28.888766 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" event={"ID":"7652cec3-8eaa-46df-a209-7babb32e5ec3","Type":"ContainerStarted","Data":"d61475b3bf3b340405aadeae5db4fa97e46c378b1fb8b879efbd0bcd5428a14b"} Jan 22 09:11:28 crc kubenswrapper[4681]: I0122 09:11:28.888776 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" 
event={"ID":"7652cec3-8eaa-46df-a209-7babb32e5ec3","Type":"ContainerStarted","Data":"85ec371d47a79bc0b1824dd0246719e192990045e17a011787614195ef03fedc"} Jan 22 09:11:29 crc kubenswrapper[4681]: I0122 09:11:29.916457 4681 scope.go:117] "RemoveContainer" containerID="dd2b558abda30b4f3c000ef8bffe9f74c225b303f83624948a59d663930bf802" Jan 22 09:11:29 crc kubenswrapper[4681]: I0122 09:11:29.940183 4681 scope.go:117] "RemoveContainer" containerID="366a4e110862a966430b777c8ce0cf25984ff4176216fe69159c26d0a936a413" Jan 22 09:11:29 crc kubenswrapper[4681]: I0122 09:11:29.960790 4681 scope.go:117] "RemoveContainer" containerID="a1f4dbe1997715334cb994382ab85cff0ae50d240e204636f4a94e6021907030" Jan 22 09:11:29 crc kubenswrapper[4681]: I0122 09:11:29.985073 4681 scope.go:117] "RemoveContainer" containerID="4d0d36534e5b37497c4d7818cde74400be04e5592041ac9bd92891a5cde0fcd5" Jan 22 09:11:31 crc kubenswrapper[4681]: I0122 09:11:31.915890 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" event={"ID":"7652cec3-8eaa-46df-a209-7babb32e5ec3","Type":"ContainerStarted","Data":"3afef9b21e873c11072357ee2c9fba25325eacb6267e60d81ac23d7b78b832cf"} Jan 22 09:11:33 crc kubenswrapper[4681]: I0122 09:11:33.933746 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" event={"ID":"7652cec3-8eaa-46df-a209-7babb32e5ec3","Type":"ContainerStarted","Data":"b430ae0ca1d6b40979f55e945500447df875ced4fb6f49512847bc42b5e40e45"} Jan 22 09:11:34 crc kubenswrapper[4681]: I0122 09:11:34.938172 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:34 crc kubenswrapper[4681]: I0122 09:11:34.938216 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:34 crc kubenswrapper[4681]: I0122 09:11:34.938256 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:34 crc kubenswrapper[4681]: I0122 09:11:34.972612 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:34 crc kubenswrapper[4681]: I0122 09:11:34.992564 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" podStartSLOduration=8.992541363 podStartE2EDuration="8.992541363s" podCreationTimestamp="2026-01-22 09:11:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:11:34.988620958 +0000 UTC m=+485.814531483" watchObservedRunningTime="2026-01-22 09:11:34.992541363 +0000 UTC m=+485.818451908" Jan 22 09:11:35 crc kubenswrapper[4681]: I0122 09:11:35.015213 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:11:56 crc kubenswrapper[4681]: I0122 09:11:56.032938 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:11:56 crc kubenswrapper[4681]: I0122 09:11:56.033772 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" 
podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:11:57 crc kubenswrapper[4681]: I0122 09:11:57.210201 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jl5g2" Jan 22 09:12:26 crc kubenswrapper[4681]: I0122 09:12:26.031846 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:12:26 crc kubenswrapper[4681]: I0122 09:12:26.032512 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:12:33 crc kubenswrapper[4681]: I0122 09:12:33.039813 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zz8lv"] Jan 22 09:12:33 crc kubenswrapper[4681]: I0122 09:12:33.043010 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zz8lv" podUID="5dd42106-064b-47b9-8d13-19d1e5ed4959" containerName="registry-server" containerID="cri-o://e21c61baaab33aeacadf349c125a6f3f3d923c5ac12d42ee6b26b6e9cdc7b2aa" gracePeriod=30 Jan 22 09:12:33 crc kubenswrapper[4681]: I0122 09:12:33.402831 4681 generic.go:334] "Generic (PLEG): container finished" podID="5dd42106-064b-47b9-8d13-19d1e5ed4959" containerID="e21c61baaab33aeacadf349c125a6f3f3d923c5ac12d42ee6b26b6e9cdc7b2aa" exitCode=0 Jan 22 09:12:33 crc kubenswrapper[4681]: I0122 09:12:33.402917 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zz8lv" event={"ID":"5dd42106-064b-47b9-8d13-19d1e5ed4959","Type":"ContainerDied","Data":"e21c61baaab33aeacadf349c125a6f3f3d923c5ac12d42ee6b26b6e9cdc7b2aa"} Jan 22 09:12:33 crc kubenswrapper[4681]: I0122 09:12:33.403227 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zz8lv" event={"ID":"5dd42106-064b-47b9-8d13-19d1e5ed4959","Type":"ContainerDied","Data":"133decd1b6879aa2a8792da2ed91c7f56968a4f14196934b0cbb1ad1f5e1ffbc"} Jan 22 09:12:33 crc kubenswrapper[4681]: I0122 09:12:33.403241 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="133decd1b6879aa2a8792da2ed91c7f56968a4f14196934b0cbb1ad1f5e1ffbc" Jan 22 09:12:33 crc kubenswrapper[4681]: I0122 09:12:33.417540 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zz8lv" Jan 22 09:12:33 crc kubenswrapper[4681]: I0122 09:12:33.491299 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzsjl\" (UniqueName: \"kubernetes.io/projected/5dd42106-064b-47b9-8d13-19d1e5ed4959-kube-api-access-mzsjl\") pod \"5dd42106-064b-47b9-8d13-19d1e5ed4959\" (UID: \"5dd42106-064b-47b9-8d13-19d1e5ed4959\") " Jan 22 09:12:33 crc kubenswrapper[4681]: I0122 09:12:33.491365 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd42106-064b-47b9-8d13-19d1e5ed4959-catalog-content\") pod \"5dd42106-064b-47b9-8d13-19d1e5ed4959\" (UID: \"5dd42106-064b-47b9-8d13-19d1e5ed4959\") " Jan 22 09:12:33 crc kubenswrapper[4681]: I0122 09:12:33.501584 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd42106-064b-47b9-8d13-19d1e5ed4959-kube-api-access-mzsjl" (OuterVolumeSpecName: "kube-api-access-mzsjl") pod "5dd42106-064b-47b9-8d13-19d1e5ed4959" (UID: "5dd42106-064b-47b9-8d13-19d1e5ed4959"). InnerVolumeSpecName "kube-api-access-mzsjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:12:33 crc kubenswrapper[4681]: I0122 09:12:33.515277 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd42106-064b-47b9-8d13-19d1e5ed4959-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5dd42106-064b-47b9-8d13-19d1e5ed4959" (UID: "5dd42106-064b-47b9-8d13-19d1e5ed4959"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:12:33 crc kubenswrapper[4681]: I0122 09:12:33.592464 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd42106-064b-47b9-8d13-19d1e5ed4959-utilities\") pod \"5dd42106-064b-47b9-8d13-19d1e5ed4959\" (UID: \"5dd42106-064b-47b9-8d13-19d1e5ed4959\") " Jan 22 09:12:33 crc kubenswrapper[4681]: I0122 09:12:33.592830 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd42106-064b-47b9-8d13-19d1e5ed4959-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:12:33 crc kubenswrapper[4681]: I0122 09:12:33.592864 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzsjl\" (UniqueName: \"kubernetes.io/projected/5dd42106-064b-47b9-8d13-19d1e5ed4959-kube-api-access-mzsjl\") on node \"crc\" DevicePath \"\"" Jan 22 09:12:33 crc kubenswrapper[4681]: I0122 09:12:33.593123 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd42106-064b-47b9-8d13-19d1e5ed4959-utilities" (OuterVolumeSpecName: "utilities") pod "5dd42106-064b-47b9-8d13-19d1e5ed4959" (UID: "5dd42106-064b-47b9-8d13-19d1e5ed4959"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:12:33 crc kubenswrapper[4681]: I0122 09:12:33.694138 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd42106-064b-47b9-8d13-19d1e5ed4959-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:12:34 crc kubenswrapper[4681]: I0122 09:12:34.409579 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zz8lv" Jan 22 09:12:34 crc kubenswrapper[4681]: I0122 09:12:34.454040 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zz8lv"] Jan 22 09:12:34 crc kubenswrapper[4681]: I0122 09:12:34.461148 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zz8lv"] Jan 22 09:12:35 crc kubenswrapper[4681]: I0122 09:12:35.462117 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dd42106-064b-47b9-8d13-19d1e5ed4959" path="/var/lib/kubelet/pods/5dd42106-064b-47b9-8d13-19d1e5ed4959/volumes" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.025642 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv"] Jan 22 09:12:37 crc kubenswrapper[4681]: E0122 09:12:37.025904 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd42106-064b-47b9-8d13-19d1e5ed4959" containerName="extract-utilities" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.025920 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd42106-064b-47b9-8d13-19d1e5ed4959" containerName="extract-utilities" Jan 22 09:12:37 crc kubenswrapper[4681]: E0122 09:12:37.025947 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd42106-064b-47b9-8d13-19d1e5ed4959" containerName="registry-server" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.025957 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd42106-064b-47b9-8d13-19d1e5ed4959" containerName="registry-server" Jan 22 09:12:37 crc kubenswrapper[4681]: E0122 09:12:37.025968 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd42106-064b-47b9-8d13-19d1e5ed4959" containerName="extract-content" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.025975 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd42106-064b-47b9-8d13-19d1e5ed4959" containerName="extract-content" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.026085 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd42106-064b-47b9-8d13-19d1e5ed4959" containerName="registry-server" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.026939 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.030074 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.037549 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv\" (UID: \"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.037644 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j65xh\" (UniqueName: \"kubernetes.io/projected/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-kube-api-access-j65xh\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv\" (UID: \"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.037680 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv\" (UID: \"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.049255 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv"] Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.139388 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv\" (UID: \"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.139887 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j65xh\" (UniqueName: \"kubernetes.io/projected/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-kube-api-access-j65xh\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv\" (UID: \"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.139929 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv\" (UID: \"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.140240 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv\" (UID: \"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.140649 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv\" (UID: \"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.175013 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j65xh\" (UniqueName: \"kubernetes.io/projected/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-kube-api-access-j65xh\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv\" (UID: \"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.343567 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" Jan 22 09:12:37 crc kubenswrapper[4681]: I0122 09:12:37.582805 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv"] Jan 22 09:12:37 crc kubenswrapper[4681]: W0122 09:12:37.590749 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1a0c173_d88f_42a7_8e5a_c8bf03be0c7c.slice/crio-7d98bf95aa50a9939ceddcf3fb73f5e9c480fa6b5374ca2de7fed05263dec973 WatchSource:0}: Error finding container 7d98bf95aa50a9939ceddcf3fb73f5e9c480fa6b5374ca2de7fed05263dec973: Status 404 returned error can't find the container with id 7d98bf95aa50a9939ceddcf3fb73f5e9c480fa6b5374ca2de7fed05263dec973 Jan 22 09:12:38 crc kubenswrapper[4681]: I0122 09:12:38.440715 4681 generic.go:334] "Generic (PLEG): container finished" podID="b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c" containerID="489adf6767f959a4209bbf503c5d2c6b410f8c9ea7f965b4ff317a484d31059b" exitCode=0 Jan 22 09:12:38 crc kubenswrapper[4681]: I0122 09:12:38.440799 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" event={"ID":"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c","Type":"ContainerDied","Data":"489adf6767f959a4209bbf503c5d2c6b410f8c9ea7f965b4ff317a484d31059b"} Jan 22 09:12:38 crc kubenswrapper[4681]: I0122 09:12:38.440855 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" event={"ID":"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c","Type":"ContainerStarted","Data":"7d98bf95aa50a9939ceddcf3fb73f5e9c480fa6b5374ca2de7fed05263dec973"} Jan 22 09:12:38 crc kubenswrapper[4681]: I0122 09:12:38.444963 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 09:12:40 crc kubenswrapper[4681]: I0122 09:12:40.456675 4681 generic.go:334] "Generic (PLEG): container finished" podID="b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c" 
containerID="a898ffe510e8c00bdc759eacb47ec1f81e9376acaba2b4c1bff341fc17fd41ce" exitCode=0 Jan 22 09:12:40 crc kubenswrapper[4681]: I0122 09:12:40.456824 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" event={"ID":"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c","Type":"ContainerDied","Data":"a898ffe510e8c00bdc759eacb47ec1f81e9376acaba2b4c1bff341fc17fd41ce"} Jan 22 09:12:41 crc kubenswrapper[4681]: I0122 09:12:41.467723 4681 generic.go:334] "Generic (PLEG): container finished" podID="b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c" containerID="32349d5d94e337d64ba517a7965401a9ff65dcac9a8b167bdff576262a36aa90" exitCode=0 Jan 22 09:12:41 crc kubenswrapper[4681]: I0122 09:12:41.467784 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" event={"ID":"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c","Type":"ContainerDied","Data":"32349d5d94e337d64ba517a7965401a9ff65dcac9a8b167bdff576262a36aa90"} Jan 22 09:12:42 crc kubenswrapper[4681]: I0122 09:12:42.790343 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" Jan 22 09:12:42 crc kubenswrapper[4681]: I0122 09:12:42.944847 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j65xh\" (UniqueName: \"kubernetes.io/projected/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-kube-api-access-j65xh\") pod \"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c\" (UID: \"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c\") " Jan 22 09:12:42 crc kubenswrapper[4681]: I0122 09:12:42.944923 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-util\") pod \"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c\" (UID: \"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c\") " Jan 22 09:12:42 crc kubenswrapper[4681]: I0122 09:12:42.944994 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-bundle\") pod \"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c\" (UID: \"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c\") " Jan 22 09:12:42 crc kubenswrapper[4681]: I0122 09:12:42.949239 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-bundle" (OuterVolumeSpecName: "bundle") pod "b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c" (UID: "b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:12:42 crc kubenswrapper[4681]: I0122 09:12:42.953137 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-kube-api-access-j65xh" (OuterVolumeSpecName: "kube-api-access-j65xh") pod "b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c" (UID: "b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c"). InnerVolumeSpecName "kube-api-access-j65xh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:12:42 crc kubenswrapper[4681]: I0122 09:12:42.967249 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-util" (OuterVolumeSpecName: "util") pod "b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c" (UID: "b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c"). 
InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.046866 4681 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-util\") on node \"crc\" DevicePath \"\"" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.046913 4681 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.046933 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j65xh\" (UniqueName: \"kubernetes.io/projected/b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c-kube-api-access-j65xh\") on node \"crc\" DevicePath \"\"" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.486187 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" event={"ID":"b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c","Type":"ContainerDied","Data":"7d98bf95aa50a9939ceddcf3fb73f5e9c480fa6b5374ca2de7fed05263dec973"} Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.486281 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d98bf95aa50a9939ceddcf3fb73f5e9c480fa6b5374ca2de7fed05263dec973" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.486391 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.621807 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g"] Jan 22 09:12:43 crc kubenswrapper[4681]: E0122 09:12:43.622133 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c" containerName="util" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.622161 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c" containerName="util" Jan 22 09:12:43 crc kubenswrapper[4681]: E0122 09:12:43.622189 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c" containerName="pull" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.622202 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c" containerName="pull" Jan 22 09:12:43 crc kubenswrapper[4681]: E0122 09:12:43.622219 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c" containerName="extract" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.622232 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c" containerName="extract" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.622425 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c" containerName="extract" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.623662 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.628810 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.633099 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g"] Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.655523 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j2c6\" (UniqueName: \"kubernetes.io/projected/bab04fd7-466e-4cca-9b7e-55e45896f903-kube-api-access-7j2c6\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g\" (UID: \"bab04fd7-466e-4cca-9b7e-55e45896f903\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.655882 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bab04fd7-466e-4cca-9b7e-55e45896f903-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g\" (UID: \"bab04fd7-466e-4cca-9b7e-55e45896f903\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.656074 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bab04fd7-466e-4cca-9b7e-55e45896f903-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g\" (UID: \"bab04fd7-466e-4cca-9b7e-55e45896f903\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.757868 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bab04fd7-466e-4cca-9b7e-55e45896f903-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g\" (UID: \"bab04fd7-466e-4cca-9b7e-55e45896f903\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.757969 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bab04fd7-466e-4cca-9b7e-55e45896f903-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g\" (UID: \"bab04fd7-466e-4cca-9b7e-55e45896f903\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.758066 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j2c6\" (UniqueName: \"kubernetes.io/projected/bab04fd7-466e-4cca-9b7e-55e45896f903-kube-api-access-7j2c6\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g\" (UID: \"bab04fd7-466e-4cca-9b7e-55e45896f903\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.758850 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/bab04fd7-466e-4cca-9b7e-55e45896f903-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g\" (UID: \"bab04fd7-466e-4cca-9b7e-55e45896f903\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.759592 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bab04fd7-466e-4cca-9b7e-55e45896f903-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g\" (UID: \"bab04fd7-466e-4cca-9b7e-55e45896f903\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.788606 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j2c6\" (UniqueName: \"kubernetes.io/projected/bab04fd7-466e-4cca-9b7e-55e45896f903-kube-api-access-7j2c6\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g\" (UID: \"bab04fd7-466e-4cca-9b7e-55e45896f903\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" Jan 22 09:12:43 crc kubenswrapper[4681]: I0122 09:12:43.941197 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" Jan 22 09:12:44 crc kubenswrapper[4681]: I0122 09:12:44.246092 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g"] Jan 22 09:12:44 crc kubenswrapper[4681]: W0122 09:12:44.252614 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbab04fd7_466e_4cca_9b7e_55e45896f903.slice/crio-442bd654c48a27dc8c0f7172fee1e278df7354dd3ab61b6e609b9fc618c11edf WatchSource:0}: Error finding container 442bd654c48a27dc8c0f7172fee1e278df7354dd3ab61b6e609b9fc618c11edf: Status 404 returned error can't find the container with id 442bd654c48a27dc8c0f7172fee1e278df7354dd3ab61b6e609b9fc618c11edf Jan 22 09:12:44 crc kubenswrapper[4681]: I0122 09:12:44.494624 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" event={"ID":"bab04fd7-466e-4cca-9b7e-55e45896f903","Type":"ContainerStarted","Data":"c590c8ee812e127576680f3dcb0ad2a84b5efe0eb65b08d0d944d9dc1bd9d6f0"} Jan 22 09:12:44 crc kubenswrapper[4681]: I0122 09:12:44.494864 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" event={"ID":"bab04fd7-466e-4cca-9b7e-55e45896f903","Type":"ContainerStarted","Data":"442bd654c48a27dc8c0f7172fee1e278df7354dd3ab61b6e609b9fc618c11edf"} Jan 22 09:12:45 crc kubenswrapper[4681]: I0122 09:12:45.508645 4681 generic.go:334] "Generic (PLEG): container finished" podID="bab04fd7-466e-4cca-9b7e-55e45896f903" containerID="c590c8ee812e127576680f3dcb0ad2a84b5efe0eb65b08d0d944d9dc1bd9d6f0" exitCode=0 Jan 22 09:12:45 crc kubenswrapper[4681]: I0122 09:12:45.508711 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" event={"ID":"bab04fd7-466e-4cca-9b7e-55e45896f903","Type":"ContainerDied","Data":"c590c8ee812e127576680f3dcb0ad2a84b5efe0eb65b08d0d944d9dc1bd9d6f0"} Jan 22 09:12:47 crc 
kubenswrapper[4681]: I0122 09:12:47.521453 4681 generic.go:334] "Generic (PLEG): container finished" podID="bab04fd7-466e-4cca-9b7e-55e45896f903" containerID="11a1dc7afdbb134eba62de68ab275107a75dfb0604611b37d2205c8d5f999ed7" exitCode=0 Jan 22 09:12:47 crc kubenswrapper[4681]: I0122 09:12:47.521507 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" event={"ID":"bab04fd7-466e-4cca-9b7e-55e45896f903","Type":"ContainerDied","Data":"11a1dc7afdbb134eba62de68ab275107a75dfb0604611b37d2205c8d5f999ed7"} Jan 22 09:12:48 crc kubenswrapper[4681]: I0122 09:12:48.528478 4681 generic.go:334] "Generic (PLEG): container finished" podID="bab04fd7-466e-4cca-9b7e-55e45896f903" containerID="f661bb970744e4a2f0274e8cfce8adf2e1a16c0b7840f0fcbeb3d227bce07afc" exitCode=0 Jan 22 09:12:48 crc kubenswrapper[4681]: I0122 09:12:48.528543 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" event={"ID":"bab04fd7-466e-4cca-9b7e-55e45896f903","Type":"ContainerDied","Data":"f661bb970744e4a2f0274e8cfce8adf2e1a16c0b7840f0fcbeb3d227bce07afc"} Jan 22 09:12:48 crc kubenswrapper[4681]: I0122 09:12:48.853213 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2"] Jan 22 09:12:48 crc kubenswrapper[4681]: I0122 09:12:48.854190 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" Jan 22 09:12:48 crc kubenswrapper[4681]: I0122 09:12:48.869040 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2"] Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.021972 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6zjn\" (UniqueName: \"kubernetes.io/projected/3e7ae8ed-882c-4537-9699-344ae1d6fa06-kube-api-access-k6zjn\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2\" (UID: \"3e7ae8ed-882c-4537-9699-344ae1d6fa06\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.022085 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e7ae8ed-882c-4537-9699-344ae1d6fa06-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2\" (UID: \"3e7ae8ed-882c-4537-9699-344ae1d6fa06\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.022181 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e7ae8ed-882c-4537-9699-344ae1d6fa06-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2\" (UID: \"3e7ae8ed-882c-4537-9699-344ae1d6fa06\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.122878 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6zjn\" (UniqueName: 
\"kubernetes.io/projected/3e7ae8ed-882c-4537-9699-344ae1d6fa06-kube-api-access-k6zjn\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2\" (UID: \"3e7ae8ed-882c-4537-9699-344ae1d6fa06\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.122932 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e7ae8ed-882c-4537-9699-344ae1d6fa06-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2\" (UID: \"3e7ae8ed-882c-4537-9699-344ae1d6fa06\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.122968 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e7ae8ed-882c-4537-9699-344ae1d6fa06-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2\" (UID: \"3e7ae8ed-882c-4537-9699-344ae1d6fa06\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.123544 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e7ae8ed-882c-4537-9699-344ae1d6fa06-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2\" (UID: \"3e7ae8ed-882c-4537-9699-344ae1d6fa06\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.123639 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e7ae8ed-882c-4537-9699-344ae1d6fa06-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2\" (UID: \"3e7ae8ed-882c-4537-9699-344ae1d6fa06\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.142967 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6zjn\" (UniqueName: \"kubernetes.io/projected/3e7ae8ed-882c-4537-9699-344ae1d6fa06-kube-api-access-k6zjn\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2\" (UID: \"3e7ae8ed-882c-4537-9699-344ae1d6fa06\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.166539 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.483614 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2"] Jan 22 09:12:49 crc kubenswrapper[4681]: W0122 09:12:49.496145 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e7ae8ed_882c_4537_9699_344ae1d6fa06.slice/crio-c92d1edd281d14fa22808ae227eed74768feb828ed72f33768ab29c1e9668906 WatchSource:0}: Error finding container c92d1edd281d14fa22808ae227eed74768feb828ed72f33768ab29c1e9668906: Status 404 returned error can't find the container with id c92d1edd281d14fa22808ae227eed74768feb828ed72f33768ab29c1e9668906 Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.538732 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" event={"ID":"3e7ae8ed-882c-4537-9699-344ae1d6fa06","Type":"ContainerStarted","Data":"c92d1edd281d14fa22808ae227eed74768feb828ed72f33768ab29c1e9668906"} Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.750819 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.934774 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bab04fd7-466e-4cca-9b7e-55e45896f903-bundle\") pod \"bab04fd7-466e-4cca-9b7e-55e45896f903\" (UID: \"bab04fd7-466e-4cca-9b7e-55e45896f903\") " Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.935004 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j2c6\" (UniqueName: \"kubernetes.io/projected/bab04fd7-466e-4cca-9b7e-55e45896f903-kube-api-access-7j2c6\") pod \"bab04fd7-466e-4cca-9b7e-55e45896f903\" (UID: \"bab04fd7-466e-4cca-9b7e-55e45896f903\") " Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.935063 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bab04fd7-466e-4cca-9b7e-55e45896f903-util\") pod \"bab04fd7-466e-4cca-9b7e-55e45896f903\" (UID: \"bab04fd7-466e-4cca-9b7e-55e45896f903\") " Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.935686 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bab04fd7-466e-4cca-9b7e-55e45896f903-bundle" (OuterVolumeSpecName: "bundle") pod "bab04fd7-466e-4cca-9b7e-55e45896f903" (UID: "bab04fd7-466e-4cca-9b7e-55e45896f903"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.940907 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bab04fd7-466e-4cca-9b7e-55e45896f903-kube-api-access-7j2c6" (OuterVolumeSpecName: "kube-api-access-7j2c6") pod "bab04fd7-466e-4cca-9b7e-55e45896f903" (UID: "bab04fd7-466e-4cca-9b7e-55e45896f903"). InnerVolumeSpecName "kube-api-access-7j2c6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:12:49 crc kubenswrapper[4681]: I0122 09:12:49.957933 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bab04fd7-466e-4cca-9b7e-55e45896f903-util" (OuterVolumeSpecName: "util") pod "bab04fd7-466e-4cca-9b7e-55e45896f903" (UID: "bab04fd7-466e-4cca-9b7e-55e45896f903"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:12:50 crc kubenswrapper[4681]: I0122 09:12:50.036319 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j2c6\" (UniqueName: \"kubernetes.io/projected/bab04fd7-466e-4cca-9b7e-55e45896f903-kube-api-access-7j2c6\") on node \"crc\" DevicePath \"\"" Jan 22 09:12:50 crc kubenswrapper[4681]: I0122 09:12:50.036352 4681 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bab04fd7-466e-4cca-9b7e-55e45896f903-util\") on node \"crc\" DevicePath \"\"" Jan 22 09:12:50 crc kubenswrapper[4681]: I0122 09:12:50.036363 4681 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bab04fd7-466e-4cca-9b7e-55e45896f903-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 09:12:50 crc kubenswrapper[4681]: I0122 09:12:50.568798 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" event={"ID":"bab04fd7-466e-4cca-9b7e-55e45896f903","Type":"ContainerDied","Data":"442bd654c48a27dc8c0f7172fee1e278df7354dd3ab61b6e609b9fc618c11edf"} Jan 22 09:12:50 crc kubenswrapper[4681]: I0122 09:12:50.568838 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="442bd654c48a27dc8c0f7172fee1e278df7354dd3ab61b6e609b9fc618c11edf" Jan 22 09:12:50 crc kubenswrapper[4681]: I0122 09:12:50.568914 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g" Jan 22 09:12:50 crc kubenswrapper[4681]: I0122 09:12:50.579080 4681 generic.go:334] "Generic (PLEG): container finished" podID="3e7ae8ed-882c-4537-9699-344ae1d6fa06" containerID="168e402a5c629d42adca1496fc10c414e1df2e642a9cea009b028a0e5af66daa" exitCode=0 Jan 22 09:12:50 crc kubenswrapper[4681]: I0122 09:12:50.579124 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" event={"ID":"3e7ae8ed-882c-4537-9699-344ae1d6fa06","Type":"ContainerDied","Data":"168e402a5c629d42adca1496fc10c414e1df2e642a9cea009b028a0e5af66daa"} Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.048821 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-6zkmp"] Jan 22 09:12:54 crc kubenswrapper[4681]: E0122 09:12:54.049636 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab04fd7-466e-4cca-9b7e-55e45896f903" containerName="extract" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.049653 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab04fd7-466e-4cca-9b7e-55e45896f903" containerName="extract" Jan 22 09:12:54 crc kubenswrapper[4681]: E0122 09:12:54.049665 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab04fd7-466e-4cca-9b7e-55e45896f903" containerName="pull" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.049674 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab04fd7-466e-4cca-9b7e-55e45896f903" containerName="pull" Jan 22 09:12:54 crc kubenswrapper[4681]: E0122 09:12:54.049687 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab04fd7-466e-4cca-9b7e-55e45896f903" containerName="util" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.049694 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab04fd7-466e-4cca-9b7e-55e45896f903" containerName="util" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.049815 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab04fd7-466e-4cca-9b7e-55e45896f903" containerName="extract" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.050226 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6zkmp" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.054800 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.054851 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.054971 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-kdpdm" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.062731 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-6zkmp"] Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.082058 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr7md\" (UniqueName: \"kubernetes.io/projected/a1a87f01-0828-4b50-9567-3e88120e3de6-kube-api-access-wr7md\") pod \"obo-prometheus-operator-68bc856cb9-6zkmp\" (UID: \"a1a87f01-0828-4b50-9567-3e88120e3de6\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6zkmp" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.165635 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-zm777"] Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.166448 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-zm777" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.168311 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.168434 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-5tghf" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.174170 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp"] Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.174834 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.183576 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr7md\" (UniqueName: \"kubernetes.io/projected/a1a87f01-0828-4b50-9567-3e88120e3de6-kube-api-access-wr7md\") pod \"obo-prometheus-operator-68bc856cb9-6zkmp\" (UID: \"a1a87f01-0828-4b50-9567-3e88120e3de6\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6zkmp" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.215163 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-zm777"] Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.224383 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr7md\" (UniqueName: \"kubernetes.io/projected/a1a87f01-0828-4b50-9567-3e88120e3de6-kube-api-access-wr7md\") pod \"obo-prometheus-operator-68bc856cb9-6zkmp\" (UID: \"a1a87f01-0828-4b50-9567-3e88120e3de6\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6zkmp" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.244601 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp"] Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.283283 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8ltql"] Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.284194 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8ltql" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.284966 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/91e0d48d-41f3-469c-8743-00b4ee3cdc94-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp\" (UID: \"91e0d48d-41f3-469c-8743-00b4ee3cdc94\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.285055 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/baa6f894-0e43-4f0a-ba66-a9dd75edd31f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-547cd868c5-zm777\" (UID: \"baa6f894-0e43-4f0a-ba66-a9dd75edd31f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-zm777" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.288540 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqps4\" (UniqueName: \"kubernetes.io/projected/1928d100-6670-42d8-898f-6102dfbfee50-kube-api-access-tqps4\") pod \"observability-operator-59bdc8b94-8ltql\" (UID: \"1928d100-6670-42d8-898f-6102dfbfee50\") " pod="openshift-operators/observability-operator-59bdc8b94-8ltql" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.288587 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/baa6f894-0e43-4f0a-ba66-a9dd75edd31f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-547cd868c5-zm777\" (UID: \"baa6f894-0e43-4f0a-ba66-a9dd75edd31f\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-zm777" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.288629 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/1928d100-6670-42d8-898f-6102dfbfee50-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8ltql\" (UID: \"1928d100-6670-42d8-898f-6102dfbfee50\") " pod="openshift-operators/observability-operator-59bdc8b94-8ltql" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.288696 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/91e0d48d-41f3-469c-8743-00b4ee3cdc94-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp\" (UID: \"91e0d48d-41f3-469c-8743-00b4ee3cdc94\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.286061 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-gxqwz" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.286154 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.297969 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8ltql"] Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.364458 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6zkmp" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.374285 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-xx7ql"] Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.375104 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-xx7ql" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.379638 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-zlrzs" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.390234 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6zpd\" (UniqueName: \"kubernetes.io/projected/fab6bf9a-3bb4-4bc8-a80e-1cd0ab9d6d2a-kube-api-access-c6zpd\") pod \"perses-operator-5bf474d74f-xx7ql\" (UID: \"fab6bf9a-3bb4-4bc8-a80e-1cd0ab9d6d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-xx7ql" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.390333 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/fab6bf9a-3bb4-4bc8-a80e-1cd0ab9d6d2a-openshift-service-ca\") pod \"perses-operator-5bf474d74f-xx7ql\" (UID: \"fab6bf9a-3bb4-4bc8-a80e-1cd0ab9d6d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-xx7ql" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.390378 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqps4\" (UniqueName: \"kubernetes.io/projected/1928d100-6670-42d8-898f-6102dfbfee50-kube-api-access-tqps4\") pod \"observability-operator-59bdc8b94-8ltql\" (UID: \"1928d100-6670-42d8-898f-6102dfbfee50\") " pod="openshift-operators/observability-operator-59bdc8b94-8ltql" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.390396 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/baa6f894-0e43-4f0a-ba66-a9dd75edd31f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-547cd868c5-zm777\" (UID: \"baa6f894-0e43-4f0a-ba66-a9dd75edd31f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-zm777" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.390422 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/1928d100-6670-42d8-898f-6102dfbfee50-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8ltql\" (UID: \"1928d100-6670-42d8-898f-6102dfbfee50\") " pod="openshift-operators/observability-operator-59bdc8b94-8ltql" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.390467 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/91e0d48d-41f3-469c-8743-00b4ee3cdc94-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp\" (UID: \"91e0d48d-41f3-469c-8743-00b4ee3cdc94\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.390497 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/91e0d48d-41f3-469c-8743-00b4ee3cdc94-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp\" (UID: \"91e0d48d-41f3-469c-8743-00b4ee3cdc94\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.390534 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/baa6f894-0e43-4f0a-ba66-a9dd75edd31f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-547cd868c5-zm777\" (UID: \"baa6f894-0e43-4f0a-ba66-a9dd75edd31f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-zm777" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.397723 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/91e0d48d-41f3-469c-8743-00b4ee3cdc94-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp\" (UID: \"91e0d48d-41f3-469c-8743-00b4ee3cdc94\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.402655 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/1928d100-6670-42d8-898f-6102dfbfee50-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8ltql\" (UID: \"1928d100-6670-42d8-898f-6102dfbfee50\") " pod="openshift-operators/observability-operator-59bdc8b94-8ltql" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.403398 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-xx7ql"] Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.404837 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/91e0d48d-41f3-469c-8743-00b4ee3cdc94-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp\" (UID: \"91e0d48d-41f3-469c-8743-00b4ee3cdc94\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.408222 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/baa6f894-0e43-4f0a-ba66-a9dd75edd31f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-547cd868c5-zm777\" (UID: \"baa6f894-0e43-4f0a-ba66-a9dd75edd31f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-zm777" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.416114 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/baa6f894-0e43-4f0a-ba66-a9dd75edd31f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-547cd868c5-zm777\" (UID: \"baa6f894-0e43-4f0a-ba66-a9dd75edd31f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-zm777" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.418007 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqps4\" (UniqueName: \"kubernetes.io/projected/1928d100-6670-42d8-898f-6102dfbfee50-kube-api-access-tqps4\") pod \"observability-operator-59bdc8b94-8ltql\" (UID: \"1928d100-6670-42d8-898f-6102dfbfee50\") " pod="openshift-operators/observability-operator-59bdc8b94-8ltql" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.494707 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/fab6bf9a-3bb4-4bc8-a80e-1cd0ab9d6d2a-openshift-service-ca\") pod \"perses-operator-5bf474d74f-xx7ql\" (UID: \"fab6bf9a-3bb4-4bc8-a80e-1cd0ab9d6d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-xx7ql" Jan 22 09:12:54 
crc kubenswrapper[4681]: I0122 09:12:54.494818 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6zpd\" (UniqueName: \"kubernetes.io/projected/fab6bf9a-3bb4-4bc8-a80e-1cd0ab9d6d2a-kube-api-access-c6zpd\") pod \"perses-operator-5bf474d74f-xx7ql\" (UID: \"fab6bf9a-3bb4-4bc8-a80e-1cd0ab9d6d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-xx7ql" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.496585 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/fab6bf9a-3bb4-4bc8-a80e-1cd0ab9d6d2a-openshift-service-ca\") pod \"perses-operator-5bf474d74f-xx7ql\" (UID: \"fab6bf9a-3bb4-4bc8-a80e-1cd0ab9d6d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-xx7ql" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.503598 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-zm777" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.507530 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.537995 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6zpd\" (UniqueName: \"kubernetes.io/projected/fab6bf9a-3bb4-4bc8-a80e-1cd0ab9d6d2a-kube-api-access-c6zpd\") pod \"perses-operator-5bf474d74f-xx7ql\" (UID: \"fab6bf9a-3bb4-4bc8-a80e-1cd0ab9d6d2a\") " pod="openshift-operators/perses-operator-5bf474d74f-xx7ql" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.610980 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8ltql" Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.700697 4681 generic.go:334] "Generic (PLEG): container finished" podID="3e7ae8ed-882c-4537-9699-344ae1d6fa06" containerID="f7e67d95cc09af1efad7e439c9de38757c3f47df45c72f8183e1e6b0c1613df0" exitCode=0 Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.700745 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" event={"ID":"3e7ae8ed-882c-4537-9699-344ae1d6fa06","Type":"ContainerDied","Data":"f7e67d95cc09af1efad7e439c9de38757c3f47df45c72f8183e1e6b0c1613df0"} Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.719346 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-6zkmp"] Jan 22 09:12:54 crc kubenswrapper[4681]: I0122 09:12:54.739508 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-xx7ql" Jan 22 09:12:55 crc kubenswrapper[4681]: I0122 09:12:55.154019 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8ltql"] Jan 22 09:12:55 crc kubenswrapper[4681]: I0122 09:12:55.173172 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-zm777"] Jan 22 09:12:55 crc kubenswrapper[4681]: I0122 09:12:55.190807 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-xx7ql"] Jan 22 09:12:55 crc kubenswrapper[4681]: W0122 09:12:55.199227 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfab6bf9a_3bb4_4bc8_a80e_1cd0ab9d6d2a.slice/crio-f1c9f6820dcbacb4f7601c2874f919a7d8aed592539b9a108965c4b38b71644a WatchSource:0}: Error finding container f1c9f6820dcbacb4f7601c2874f919a7d8aed592539b9a108965c4b38b71644a: Status 404 returned error can't find the container with id f1c9f6820dcbacb4f7601c2874f919a7d8aed592539b9a108965c4b38b71644a Jan 22 09:12:55 crc kubenswrapper[4681]: I0122 09:12:55.237414 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp"] Jan 22 09:12:55 crc kubenswrapper[4681]: I0122 09:12:55.707999 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-zm777" event={"ID":"baa6f894-0e43-4f0a-ba66-a9dd75edd31f","Type":"ContainerStarted","Data":"0e695636d338c6f91cf74b00ca89024262cf7491130f5ca2871f40d820939a6f"} Jan 22 09:12:55 crc kubenswrapper[4681]: I0122 09:12:55.709293 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6zkmp" event={"ID":"a1a87f01-0828-4b50-9567-3e88120e3de6","Type":"ContainerStarted","Data":"5ecc003a11bbcb70de9c37fa0a5fc32cd3b4cb7f651f87f921981f76abdaf6ae"} Jan 22 09:12:55 crc kubenswrapper[4681]: I0122 09:12:55.710670 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-8ltql" event={"ID":"1928d100-6670-42d8-898f-6102dfbfee50","Type":"ContainerStarted","Data":"8328bbe6270702b20ccc5e12576ebe35d6a21ea951bd759cca1bebd209ca6abc"} Jan 22 09:12:55 crc kubenswrapper[4681]: I0122 09:12:55.711886 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp" event={"ID":"91e0d48d-41f3-469c-8743-00b4ee3cdc94","Type":"ContainerStarted","Data":"d092acbfae4620a6ee3040f04d3757a71b1031fe2585e0a77bded72ef86267ee"} Jan 22 09:12:55 crc kubenswrapper[4681]: I0122 09:12:55.714328 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" event={"ID":"3e7ae8ed-882c-4537-9699-344ae1d6fa06","Type":"ContainerStarted","Data":"935ac9b38e5e71de2f18fde0b8fcaaf9f8240023fa81edf044e89abdbf9eb818"} Jan 22 09:12:55 crc kubenswrapper[4681]: I0122 09:12:55.715314 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-xx7ql" event={"ID":"fab6bf9a-3bb4-4bc8-a80e-1cd0ab9d6d2a","Type":"ContainerStarted","Data":"f1c9f6820dcbacb4f7601c2874f919a7d8aed592539b9a108965c4b38b71644a"} Jan 22 09:12:55 crc kubenswrapper[4681]: I0122 09:12:55.737063 4681 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" podStartSLOduration=4.462387584 podStartE2EDuration="7.737043881s" podCreationTimestamp="2026-01-22 09:12:48 +0000 UTC" firstStartedPulling="2026-01-22 09:12:50.581714232 +0000 UTC m=+561.407624737" lastFinishedPulling="2026-01-22 09:12:53.856370529 +0000 UTC m=+564.682281034" observedRunningTime="2026-01-22 09:12:55.73243217 +0000 UTC m=+566.558342695" watchObservedRunningTime="2026-01-22 09:12:55.737043881 +0000 UTC m=+566.562954396" Jan 22 09:12:56 crc kubenswrapper[4681]: I0122 09:12:56.031238 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:12:56 crc kubenswrapper[4681]: I0122 09:12:56.031369 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:12:56 crc kubenswrapper[4681]: I0122 09:12:56.031448 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:12:56 crc kubenswrapper[4681]: I0122 09:12:56.032255 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39698b3d5d144b43917440c8cecc471264f2f7dcffc44d6bfe898d27e9d76dce"} pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 09:12:56 crc kubenswrapper[4681]: I0122 09:12:56.032428 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" containerID="cri-o://39698b3d5d144b43917440c8cecc471264f2f7dcffc44d6bfe898d27e9d76dce" gracePeriod=600 Jan 22 09:12:56 crc kubenswrapper[4681]: I0122 09:12:56.726903 4681 generic.go:334] "Generic (PLEG): container finished" podID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerID="39698b3d5d144b43917440c8cecc471264f2f7dcffc44d6bfe898d27e9d76dce" exitCode=0 Jan 22 09:12:56 crc kubenswrapper[4681]: I0122 09:12:56.726959 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerDied","Data":"39698b3d5d144b43917440c8cecc471264f2f7dcffc44d6bfe898d27e9d76dce"} Jan 22 09:12:56 crc kubenswrapper[4681]: I0122 09:12:56.726986 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerStarted","Data":"6ac7ebdaf79be25ddb77d35c84045ee94bb99bba54c6d511a9e4c0510347ef3c"} Jan 22 09:12:56 crc kubenswrapper[4681]: I0122 09:12:56.727003 4681 scope.go:117] "RemoveContainer" containerID="b4ec2fa3e3aaacfce78c09a1f6c50b2addaa48c1eb65926acc4c02cf4a2b90d9" Jan 22 09:12:56 crc kubenswrapper[4681]: I0122 09:12:56.730636 
4681 generic.go:334] "Generic (PLEG): container finished" podID="3e7ae8ed-882c-4537-9699-344ae1d6fa06" containerID="935ac9b38e5e71de2f18fde0b8fcaaf9f8240023fa81edf044e89abdbf9eb818" exitCode=0 Jan 22 09:12:56 crc kubenswrapper[4681]: I0122 09:12:56.730668 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" event={"ID":"3e7ae8ed-882c-4537-9699-344ae1d6fa06","Type":"ContainerDied","Data":"935ac9b38e5e71de2f18fde0b8fcaaf9f8240023fa81edf044e89abdbf9eb818"} Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.000217 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.151388 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e7ae8ed-882c-4537-9699-344ae1d6fa06-util\") pod \"3e7ae8ed-882c-4537-9699-344ae1d6fa06\" (UID: \"3e7ae8ed-882c-4537-9699-344ae1d6fa06\") " Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.151453 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e7ae8ed-882c-4537-9699-344ae1d6fa06-bundle\") pod \"3e7ae8ed-882c-4537-9699-344ae1d6fa06\" (UID: \"3e7ae8ed-882c-4537-9699-344ae1d6fa06\") " Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.151500 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6zjn\" (UniqueName: \"kubernetes.io/projected/3e7ae8ed-882c-4537-9699-344ae1d6fa06-kube-api-access-k6zjn\") pod \"3e7ae8ed-882c-4537-9699-344ae1d6fa06\" (UID: \"3e7ae8ed-882c-4537-9699-344ae1d6fa06\") " Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.153376 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e7ae8ed-882c-4537-9699-344ae1d6fa06-bundle" (OuterVolumeSpecName: "bundle") pod "3e7ae8ed-882c-4537-9699-344ae1d6fa06" (UID: "3e7ae8ed-882c-4537-9699-344ae1d6fa06"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.171600 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e7ae8ed-882c-4537-9699-344ae1d6fa06-kube-api-access-k6zjn" (OuterVolumeSpecName: "kube-api-access-k6zjn") pod "3e7ae8ed-882c-4537-9699-344ae1d6fa06" (UID: "3e7ae8ed-882c-4537-9699-344ae1d6fa06"). InnerVolumeSpecName "kube-api-access-k6zjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.178139 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e7ae8ed-882c-4537-9699-344ae1d6fa06-util" (OuterVolumeSpecName: "util") pod "3e7ae8ed-882c-4537-9699-344ae1d6fa06" (UID: "3e7ae8ed-882c-4537-9699-344ae1d6fa06"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.259922 4681 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e7ae8ed-882c-4537-9699-344ae1d6fa06-util\") on node \"crc\" DevicePath \"\"" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.259956 4681 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e7ae8ed-882c-4537-9699-344ae1d6fa06-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.259968 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6zjn\" (UniqueName: \"kubernetes.io/projected/3e7ae8ed-882c-4537-9699-344ae1d6fa06-kube-api-access-k6zjn\") on node \"crc\" DevicePath \"\"" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.589331 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-577ff7768c-mpfxh"] Jan 22 09:12:58 crc kubenswrapper[4681]: E0122 09:12:58.589538 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7ae8ed-882c-4537-9699-344ae1d6fa06" containerName="pull" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.589549 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7ae8ed-882c-4537-9699-344ae1d6fa06" containerName="pull" Jan 22 09:12:58 crc kubenswrapper[4681]: E0122 09:12:58.589560 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7ae8ed-882c-4537-9699-344ae1d6fa06" containerName="extract" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.589566 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7ae8ed-882c-4537-9699-344ae1d6fa06" containerName="extract" Jan 22 09:12:58 crc kubenswrapper[4681]: E0122 09:12:58.589574 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7ae8ed-882c-4537-9699-344ae1d6fa06" containerName="util" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.589580 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7ae8ed-882c-4537-9699-344ae1d6fa06" containerName="util" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.589674 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e7ae8ed-882c-4537-9699-344ae1d6fa06" containerName="extract" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.590030 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-577ff7768c-mpfxh" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.591680 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.591893 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.592011 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-b5qvb" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.594129 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.605903 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-577ff7768c-mpfxh"] Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.756418 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" event={"ID":"3e7ae8ed-882c-4537-9699-344ae1d6fa06","Type":"ContainerDied","Data":"c92d1edd281d14fa22808ae227eed74768feb828ed72f33768ab29c1e9668906"} Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.756665 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c92d1edd281d14fa22808ae227eed74768feb828ed72f33768ab29c1e9668906" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.756480 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.771882 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbltt\" (UniqueName: \"kubernetes.io/projected/c72b89e7-f149-4a53-b660-54ca9f4cf900-kube-api-access-pbltt\") pod \"elastic-operator-577ff7768c-mpfxh\" (UID: \"c72b89e7-f149-4a53-b660-54ca9f4cf900\") " pod="service-telemetry/elastic-operator-577ff7768c-mpfxh" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.771985 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c72b89e7-f149-4a53-b660-54ca9f4cf900-webhook-cert\") pod \"elastic-operator-577ff7768c-mpfxh\" (UID: \"c72b89e7-f149-4a53-b660-54ca9f4cf900\") " pod="service-telemetry/elastic-operator-577ff7768c-mpfxh" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.772042 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c72b89e7-f149-4a53-b660-54ca9f4cf900-apiservice-cert\") pod \"elastic-operator-577ff7768c-mpfxh\" (UID: \"c72b89e7-f149-4a53-b660-54ca9f4cf900\") " pod="service-telemetry/elastic-operator-577ff7768c-mpfxh" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.873288 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbltt\" (UniqueName: \"kubernetes.io/projected/c72b89e7-f149-4a53-b660-54ca9f4cf900-kube-api-access-pbltt\") pod \"elastic-operator-577ff7768c-mpfxh\" (UID: \"c72b89e7-f149-4a53-b660-54ca9f4cf900\") " pod="service-telemetry/elastic-operator-577ff7768c-mpfxh" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 
09:12:58.873361 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c72b89e7-f149-4a53-b660-54ca9f4cf900-webhook-cert\") pod \"elastic-operator-577ff7768c-mpfxh\" (UID: \"c72b89e7-f149-4a53-b660-54ca9f4cf900\") " pod="service-telemetry/elastic-operator-577ff7768c-mpfxh" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.873393 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c72b89e7-f149-4a53-b660-54ca9f4cf900-apiservice-cert\") pod \"elastic-operator-577ff7768c-mpfxh\" (UID: \"c72b89e7-f149-4a53-b660-54ca9f4cf900\") " pod="service-telemetry/elastic-operator-577ff7768c-mpfxh" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.877922 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c72b89e7-f149-4a53-b660-54ca9f4cf900-webhook-cert\") pod \"elastic-operator-577ff7768c-mpfxh\" (UID: \"c72b89e7-f149-4a53-b660-54ca9f4cf900\") " pod="service-telemetry/elastic-operator-577ff7768c-mpfxh" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.887828 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbltt\" (UniqueName: \"kubernetes.io/projected/c72b89e7-f149-4a53-b660-54ca9f4cf900-kube-api-access-pbltt\") pod \"elastic-operator-577ff7768c-mpfxh\" (UID: \"c72b89e7-f149-4a53-b660-54ca9f4cf900\") " pod="service-telemetry/elastic-operator-577ff7768c-mpfxh" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.892454 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c72b89e7-f149-4a53-b660-54ca9f4cf900-apiservice-cert\") pod \"elastic-operator-577ff7768c-mpfxh\" (UID: \"c72b89e7-f149-4a53-b660-54ca9f4cf900\") " pod="service-telemetry/elastic-operator-577ff7768c-mpfxh" Jan 22 09:12:58 crc kubenswrapper[4681]: I0122 09:12:58.907064 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-577ff7768c-mpfxh" Jan 22 09:13:08 crc kubenswrapper[4681]: E0122 09:13:08.443600 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea" Jan 22 09:13:08 crc kubenswrapper[4681]: E0122 09:13:08.445196 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp_openshift-operators(91e0d48d-41f3-469c-8743-00b4ee3cdc94): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 09:13:08 crc kubenswrapper[4681]: E0122 09:13:08.446440 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp" podUID="91e0d48d-41f3-469c-8743-00b4ee3cdc94" Jan 22 09:13:08 crc kubenswrapper[4681]: E0122 09:13:08.809574 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp" podUID="91e0d48d-41f3-469c-8743-00b4ee3cdc94" Jan 22 09:13:09 crc kubenswrapper[4681]: E0122 09:13:09.312031 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a" Jan 22 09:13:09 crc kubenswrapper[4681]: E0122 09:13:09.312476 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator --thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator --watch-referenced-objects-in-all-namespaces=true --disable-unmanaged-prometheus-configuration=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:9a2097bc5b2e02bc1703f64c452ce8fe4bc6775b732db930ff4770b76ae4653a,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wr7md,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-68bc856cb9-6zkmp_openshift-operators(a1a87f01-0828-4b50-9567-3e88120e3de6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 22 09:13:09 crc kubenswrapper[4681]: E0122 09:13:09.314570 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: 
\"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6zkmp" podUID="a1a87f01-0828-4b50-9567-3e88120e3de6" Jan 22 09:13:09 crc kubenswrapper[4681]: I0122 09:13:09.499704 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-577ff7768c-mpfxh"] Jan 22 09:13:09 crc kubenswrapper[4681]: W0122 09:13:09.503840 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc72b89e7_f149_4a53_b660_54ca9f4cf900.slice/crio-f77540176390b410d608f4cd1b755e58d130187738256966629f1e7d7dff1704 WatchSource:0}: Error finding container f77540176390b410d608f4cd1b755e58d130187738256966629f1e7d7dff1704: Status 404 returned error can't find the container with id f77540176390b410d608f4cd1b755e58d130187738256966629f1e7d7dff1704 Jan 22 09:13:09 crc kubenswrapper[4681]: I0122 09:13:09.813443 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-577ff7768c-mpfxh" event={"ID":"c72b89e7-f149-4a53-b660-54ca9f4cf900","Type":"ContainerStarted","Data":"f77540176390b410d608f4cd1b755e58d130187738256966629f1e7d7dff1704"} Jan 22 09:13:09 crc kubenswrapper[4681]: I0122 09:13:09.815440 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-xx7ql" event={"ID":"fab6bf9a-3bb4-4bc8-a80e-1cd0ab9d6d2a","Type":"ContainerStarted","Data":"0ed1fcdb9a92320ce42b0df56afd56be24043c9426d5cdc77a368c3738a1ecad"} Jan 22 09:13:09 crc kubenswrapper[4681]: I0122 09:13:09.815726 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-xx7ql" Jan 22 09:13:09 crc kubenswrapper[4681]: I0122 09:13:09.816884 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-zm777" event={"ID":"baa6f894-0e43-4f0a-ba66-a9dd75edd31f","Type":"ContainerStarted","Data":"fe855beb9c9fbd632a8b0633dcb0cc28fc92231c57e33361d485246c55d1d0f9"} Jan 22 09:13:09 crc kubenswrapper[4681]: I0122 09:13:09.818765 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-8ltql" event={"ID":"1928d100-6670-42d8-898f-6102dfbfee50","Type":"ContainerStarted","Data":"35fdf819bf302909205d5cd62607c1c9bf16e0ec9365c455c40ffd2b4063aa25"} Jan 22 09:13:09 crc kubenswrapper[4681]: I0122 09:13:09.819169 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-8ltql" Jan 22 09:13:09 crc kubenswrapper[4681]: E0122 09:13:09.819717 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a\\\"\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6zkmp" podUID="a1a87f01-0828-4b50-9567-3e88120e3de6" Jan 22 09:13:09 crc kubenswrapper[4681]: I0122 09:13:09.832701 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-xx7ql" podStartSLOduration=1.71368684 podStartE2EDuration="15.832681437s" podCreationTimestamp="2026-01-22 09:12:54 +0000 UTC" firstStartedPulling="2026-01-22 09:12:55.202581161 +0000 
UTC m=+566.028491666" lastFinishedPulling="2026-01-22 09:13:09.321575758 +0000 UTC m=+580.147486263" observedRunningTime="2026-01-22 09:13:09.830001887 +0000 UTC m=+580.655912392" watchObservedRunningTime="2026-01-22 09:13:09.832681437 +0000 UTC m=+580.658591942" Jan 22 09:13:09 crc kubenswrapper[4681]: I0122 09:13:09.860633 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-8ltql" Jan 22 09:13:09 crc kubenswrapper[4681]: I0122 09:13:09.900506 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-8ltql" podStartSLOduration=1.7219491850000002 podStartE2EDuration="15.900479704s" podCreationTimestamp="2026-01-22 09:12:54 +0000 UTC" firstStartedPulling="2026-01-22 09:12:55.173434136 +0000 UTC m=+565.999344641" lastFinishedPulling="2026-01-22 09:13:09.351964645 +0000 UTC m=+580.177875160" observedRunningTime="2026-01-22 09:13:09.898071661 +0000 UTC m=+580.723982196" watchObservedRunningTime="2026-01-22 09:13:09.900479704 +0000 UTC m=+580.726390249" Jan 22 09:13:09 crc kubenswrapper[4681]: I0122 09:13:09.906019 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-zm777" podStartSLOduration=1.765567109 podStartE2EDuration="15.906004208s" podCreationTimestamp="2026-01-22 09:12:54 +0000 UTC" firstStartedPulling="2026-01-22 09:12:55.185526814 +0000 UTC m=+566.011437319" lastFinishedPulling="2026-01-22 09:13:09.325963903 +0000 UTC m=+580.151874418" observedRunningTime="2026-01-22 09:13:09.860225749 +0000 UTC m=+580.686136254" watchObservedRunningTime="2026-01-22 09:13:09.906004208 +0000 UTC m=+580.731914743" Jan 22 09:13:12 crc kubenswrapper[4681]: I0122 09:13:12.836128 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-577ff7768c-mpfxh" event={"ID":"c72b89e7-f149-4a53-b660-54ca9f4cf900","Type":"ContainerStarted","Data":"d27a8561faff298eb2a52b0a91fcc7e45cd371f731f013523fdeeca8fc3696f4"} Jan 22 09:13:12 crc kubenswrapper[4681]: I0122 09:13:12.879732 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-577ff7768c-mpfxh" podStartSLOduration=12.418000482 podStartE2EDuration="14.879717847s" podCreationTimestamp="2026-01-22 09:12:58 +0000 UTC" firstStartedPulling="2026-01-22 09:13:09.50667482 +0000 UTC m=+580.332585325" lastFinishedPulling="2026-01-22 09:13:11.968392175 +0000 UTC m=+582.794302690" observedRunningTime="2026-01-22 09:13:12.877328934 +0000 UTC m=+583.703239439" watchObservedRunningTime="2026-01-22 09:13:12.879717847 +0000 UTC m=+583.705628352" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.134118 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.135175 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.139187 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.139360 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.139523 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.140544 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.140667 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.140867 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-q8x79" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.141061 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.145514 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.145649 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.197904 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/44b13aea-bee1-4576-87b4-d41165fcc2fa-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.197948 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.198095 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.198144 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc 
kubenswrapper[4681]: I0122 09:13:13.198175 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.198203 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.198269 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.198300 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.198341 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.198374 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.210565 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.299340 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.299400 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: 
\"kubernetes.io/configmap/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.299439 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.299468 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.299502 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.299603 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.299624 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/44b13aea-bee1-4576-87b4-d41165fcc2fa-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.299645 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.299662 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.299721 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-elasticsearch-logs\") pod 
\"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.299751 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.299778 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.299934 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.299973 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.300008 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.300197 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.300567 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.300636 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: 
\"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.301750 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.305151 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.305173 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.305174 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.305724 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/44b13aea-bee1-4576-87b4-d41165fcc2fa-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.309800 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.310527 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.400817 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.400876 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.400929 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.400964 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.400993 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.401482 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.401961 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.402357 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.402372 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/44b13aea-bee1-4576-87b4-d41165fcc2fa-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.404951 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/44b13aea-bee1-4576-87b4-d41165fcc2fa-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"44b13aea-bee1-4576-87b4-d41165fcc2fa\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: 
I0122 09:13:13.451650 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:13 crc kubenswrapper[4681]: I0122 09:13:13.668193 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 22 09:13:13 crc kubenswrapper[4681]: W0122 09:13:13.675308 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44b13aea_bee1_4576_87b4_d41165fcc2fa.slice/crio-8ee30da396258c418ebefbd4367bbabb37c17f61842d8a690daf4766098b5bbf WatchSource:0}: Error finding container 8ee30da396258c418ebefbd4367bbabb37c17f61842d8a690daf4766098b5bbf: Status 404 returned error can't find the container with id 8ee30da396258c418ebefbd4367bbabb37c17f61842d8a690daf4766098b5bbf Jan 22 09:13:14 crc kubenswrapper[4681]: I0122 09:13:14.044249 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"44b13aea-bee1-4576-87b4-d41165fcc2fa","Type":"ContainerStarted","Data":"8ee30da396258c418ebefbd4367bbabb37c17f61842d8a690daf4766098b5bbf"} Jan 22 09:13:14 crc kubenswrapper[4681]: I0122 09:13:14.343399 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-8v77f"] Jan 22 09:13:14 crc kubenswrapper[4681]: I0122 09:13:14.344068 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-8v77f" Jan 22 09:13:14 crc kubenswrapper[4681]: I0122 09:13:14.347720 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Jan 22 09:13:14 crc kubenswrapper[4681]: I0122 09:13:14.347767 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Jan 22 09:13:14 crc kubenswrapper[4681]: I0122 09:13:14.347806 4681 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-bls2k" Jan 22 09:13:14 crc kubenswrapper[4681]: I0122 09:13:14.370116 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-8v77f"] Jan 22 09:13:14 crc kubenswrapper[4681]: I0122 09:13:14.447994 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j72nx\" (UniqueName: \"kubernetes.io/projected/149ecf59-afc7-4e9c-8f5b-3e9b4aa43360-kube-api-access-j72nx\") pod \"cert-manager-operator-controller-manager-5446d6888b-8v77f\" (UID: \"149ecf59-afc7-4e9c-8f5b-3e9b4aa43360\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-8v77f" Jan 22 09:13:14 crc kubenswrapper[4681]: I0122 09:13:14.448196 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/149ecf59-afc7-4e9c-8f5b-3e9b4aa43360-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-8v77f\" (UID: \"149ecf59-afc7-4e9c-8f5b-3e9b4aa43360\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-8v77f" Jan 22 09:13:14 crc kubenswrapper[4681]: I0122 09:13:14.550479 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j72nx\" (UniqueName: 
\"kubernetes.io/projected/149ecf59-afc7-4e9c-8f5b-3e9b4aa43360-kube-api-access-j72nx\") pod \"cert-manager-operator-controller-manager-5446d6888b-8v77f\" (UID: \"149ecf59-afc7-4e9c-8f5b-3e9b4aa43360\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-8v77f" Jan 22 09:13:14 crc kubenswrapper[4681]: I0122 09:13:14.551362 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/149ecf59-afc7-4e9c-8f5b-3e9b4aa43360-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-8v77f\" (UID: \"149ecf59-afc7-4e9c-8f5b-3e9b4aa43360\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-8v77f" Jan 22 09:13:14 crc kubenswrapper[4681]: I0122 09:13:14.551491 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/149ecf59-afc7-4e9c-8f5b-3e9b4aa43360-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-8v77f\" (UID: \"149ecf59-afc7-4e9c-8f5b-3e9b4aa43360\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-8v77f" Jan 22 09:13:14 crc kubenswrapper[4681]: I0122 09:13:14.586493 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j72nx\" (UniqueName: \"kubernetes.io/projected/149ecf59-afc7-4e9c-8f5b-3e9b4aa43360-kube-api-access-j72nx\") pod \"cert-manager-operator-controller-manager-5446d6888b-8v77f\" (UID: \"149ecf59-afc7-4e9c-8f5b-3e9b4aa43360\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-8v77f" Jan 22 09:13:14 crc kubenswrapper[4681]: I0122 09:13:14.672024 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-8v77f" Jan 22 09:13:14 crc kubenswrapper[4681]: I0122 09:13:14.744034 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-xx7ql" Jan 22 09:13:14 crc kubenswrapper[4681]: I0122 09:13:14.906097 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-8v77f"] Jan 22 09:13:14 crc kubenswrapper[4681]: W0122 09:13:14.935511 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod149ecf59_afc7_4e9c_8f5b_3e9b4aa43360.slice/crio-183330ed04d9585e1ca9f33d808a4de2545cb0680685963ead69b387247432f3 WatchSource:0}: Error finding container 183330ed04d9585e1ca9f33d808a4de2545cb0680685963ead69b387247432f3: Status 404 returned error can't find the container with id 183330ed04d9585e1ca9f33d808a4de2545cb0680685963ead69b387247432f3 Jan 22 09:13:15 crc kubenswrapper[4681]: I0122 09:13:15.052235 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-8v77f" event={"ID":"149ecf59-afc7-4e9c-8f5b-3e9b4aa43360","Type":"ContainerStarted","Data":"183330ed04d9585e1ca9f33d808a4de2545cb0680685963ead69b387247432f3"} Jan 22 09:13:30 crc kubenswrapper[4681]: I0122 09:13:30.141984 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-8v77f" event={"ID":"149ecf59-afc7-4e9c-8f5b-3e9b4aa43360","Type":"ContainerStarted","Data":"ec0cc9d384016a86b6c131be518cc79b92141cc81b335e0cae5598558490c58c"} Jan 22 09:13:30 crc kubenswrapper[4681]: I0122 
09:13:30.143434 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp" event={"ID":"91e0d48d-41f3-469c-8743-00b4ee3cdc94","Type":"ContainerStarted","Data":"f518e14f1d14176597570e9ef983674ab5b9a32c3290f6c623b2bb40887ef912"} Jan 22 09:13:30 crc kubenswrapper[4681]: I0122 09:13:30.173379 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-8v77f" podStartSLOduration=1.363866931 podStartE2EDuration="16.173350021s" podCreationTimestamp="2026-01-22 09:13:14 +0000 UTC" firstStartedPulling="2026-01-22 09:13:14.95152554 +0000 UTC m=+585.777436045" lastFinishedPulling="2026-01-22 09:13:29.76100864 +0000 UTC m=+600.586919135" observedRunningTime="2026-01-22 09:13:30.161534044 +0000 UTC m=+600.987444569" watchObservedRunningTime="2026-01-22 09:13:30.173350021 +0000 UTC m=+600.999260596" Jan 22 09:13:30 crc kubenswrapper[4681]: I0122 09:13:30.190169 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp" podStartSLOduration=-9223372000.66463 podStartE2EDuration="36.19014519s" podCreationTimestamp="2026-01-22 09:12:54 +0000 UTC" firstStartedPulling="2026-01-22 09:12:55.249928602 +0000 UTC m=+566.075839107" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:13:30.185186738 +0000 UTC m=+601.011097243" watchObservedRunningTime="2026-01-22 09:13:30.19014519 +0000 UTC m=+601.016055695" Jan 22 09:13:31 crc kubenswrapper[4681]: I0122 09:13:31.153063 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"44b13aea-bee1-4576-87b4-d41165fcc2fa","Type":"ContainerStarted","Data":"3d4e0e1e2192741eba48f09cfc41592ec99eb0583c66e7cbaa7726dcab49cfa8"} Jan 22 09:13:31 crc kubenswrapper[4681]: I0122 09:13:31.155709 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6zkmp" event={"ID":"a1a87f01-0828-4b50-9567-3e88120e3de6","Type":"ContainerStarted","Data":"f75768855ba472f785dbe37fa2cd261650802388a1231f9335eb9e73cceec884"} Jan 22 09:13:31 crc kubenswrapper[4681]: I0122 09:13:31.237178 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-6zkmp" podStartSLOduration=2.193617349 podStartE2EDuration="37.237156285s" podCreationTimestamp="2026-01-22 09:12:54 +0000 UTC" firstStartedPulling="2026-01-22 09:12:54.823478522 +0000 UTC m=+565.649389027" lastFinishedPulling="2026-01-22 09:13:29.867017458 +0000 UTC m=+600.692927963" observedRunningTime="2026-01-22 09:13:31.232384677 +0000 UTC m=+602.058295222" watchObservedRunningTime="2026-01-22 09:13:31.237156285 +0000 UTC m=+602.063066800" Jan 22 09:13:31 crc kubenswrapper[4681]: I0122 09:13:31.320773 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 22 09:13:31 crc kubenswrapper[4681]: I0122 09:13:31.348092 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 22 09:13:33 crc kubenswrapper[4681]: I0122 09:13:33.167756 4681 generic.go:334] "Generic (PLEG): container finished" podID="44b13aea-bee1-4576-87b4-d41165fcc2fa" containerID="3d4e0e1e2192741eba48f09cfc41592ec99eb0583c66e7cbaa7726dcab49cfa8" exitCode=0 Jan 22 09:13:33 crc kubenswrapper[4681]: 
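The pod_startup_latency_tracker entries above print four timestamps per pod. For cert-manager-operator-controller-manager-5446d6888b-8v77f, podStartE2EDuration matches watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration matches that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling), to within clock rounding. The -9223372000.66463s value logged for the obo-prometheus-operator-admission-webhook pod is consistent with the same subtraction when lastFinishedPulling is the zero time ("0001-01-01 00:00:00"): the intermediate time.Duration saturates at its minimum. A small cross-check in Go, using the timestamps copied from the log; this is an after-the-fact reconstruction, not the kubelet's own bookkeeping.

```go
package main

import (
	"fmt"
	"time"
)

// Recomputes the startup-latency figures logged for
// cert-manager-operator-controller-manager-5446d6888b-8v77f from the
// timestamps in the pod_startup_latency_tracker entry above.
func main() {
	const layout = "2006-01-02 15:04:05 -0700 MST" // time.Parse also accepts the fractional seconds
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	created := parse("2026-01-22 09:13:14 +0000 UTC")            // podCreationTimestamp
	firstPull := parse("2026-01-22 09:13:14.95152554 +0000 UTC") // firstStartedPulling
	lastPull := parse("2026-01-22 09:13:29.76100864 +0000 UTC")  // lastFinishedPulling
	observed := parse("2026-01-22 09:13:30.173350021 +0000 UTC") // watchObservedRunningTime

	e2e := observed.Sub(created)         // ≈ 16.173350021s, the logged podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // ≈ 1.3638669s, the logged podStartSLOduration up to rounding
	fmt.Println("e2e:", e2e, "excluding image pull:", slo)
}
```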
I0122 09:13:33.167901 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"44b13aea-bee1-4576-87b4-d41165fcc2fa","Type":"ContainerDied","Data":"3d4e0e1e2192741eba48f09cfc41592ec99eb0583c66e7cbaa7726dcab49cfa8"} Jan 22 09:13:35 crc kubenswrapper[4681]: I0122 09:13:35.353196 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-p7hrl"] Jan 22 09:13:35 crc kubenswrapper[4681]: I0122 09:13:35.356334 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-p7hrl" Jan 22 09:13:35 crc kubenswrapper[4681]: I0122 09:13:35.358619 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 22 09:13:35 crc kubenswrapper[4681]: I0122 09:13:35.359489 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 22 09:13:35 crc kubenswrapper[4681]: I0122 09:13:35.360519 4681 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-58xc9" Jan 22 09:13:35 crc kubenswrapper[4681]: I0122 09:13:35.369827 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-p7hrl"] Jan 22 09:13:35 crc kubenswrapper[4681]: I0122 09:13:35.404532 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2177d3cd-4719-4db5-b8fc-cbfa8e83ddf2-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-p7hrl\" (UID: \"2177d3cd-4719-4db5-b8fc-cbfa8e83ddf2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-p7hrl" Jan 22 09:13:35 crc kubenswrapper[4681]: I0122 09:13:35.404640 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnld2\" (UniqueName: \"kubernetes.io/projected/2177d3cd-4719-4db5-b8fc-cbfa8e83ddf2-kube-api-access-bnld2\") pod \"cert-manager-cainjector-855d9ccff4-p7hrl\" (UID: \"2177d3cd-4719-4db5-b8fc-cbfa8e83ddf2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-p7hrl" Jan 22 09:13:35 crc kubenswrapper[4681]: I0122 09:13:35.505835 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2177d3cd-4719-4db5-b8fc-cbfa8e83ddf2-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-p7hrl\" (UID: \"2177d3cd-4719-4db5-b8fc-cbfa8e83ddf2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-p7hrl" Jan 22 09:13:35 crc kubenswrapper[4681]: I0122 09:13:35.505950 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnld2\" (UniqueName: \"kubernetes.io/projected/2177d3cd-4719-4db5-b8fc-cbfa8e83ddf2-kube-api-access-bnld2\") pod \"cert-manager-cainjector-855d9ccff4-p7hrl\" (UID: \"2177d3cd-4719-4db5-b8fc-cbfa8e83ddf2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-p7hrl" Jan 22 09:13:35 crc kubenswrapper[4681]: I0122 09:13:35.536991 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnld2\" (UniqueName: \"kubernetes.io/projected/2177d3cd-4719-4db5-b8fc-cbfa8e83ddf2-kube-api-access-bnld2\") pod \"cert-manager-cainjector-855d9ccff4-p7hrl\" (UID: \"2177d3cd-4719-4db5-b8fc-cbfa8e83ddf2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-p7hrl" Jan 22 09:13:35 crc kubenswrapper[4681]: 
I0122 09:13:35.542549 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2177d3cd-4719-4db5-b8fc-cbfa8e83ddf2-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-p7hrl\" (UID: \"2177d3cd-4719-4db5-b8fc-cbfa8e83ddf2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-p7hrl" Jan 22 09:13:35 crc kubenswrapper[4681]: I0122 09:13:35.674033 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-p7hrl" Jan 22 09:13:35 crc kubenswrapper[4681]: I0122 09:13:35.935222 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-p7hrl"] Jan 22 09:13:36 crc kubenswrapper[4681]: I0122 09:13:36.185097 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-p7hrl" event={"ID":"2177d3cd-4719-4db5-b8fc-cbfa8e83ddf2","Type":"ContainerStarted","Data":"2105ae2fd30e36f1d48960bdb74b651219c911841a3db9c02f6ad1eeeb109e5d"} Jan 22 09:13:36 crc kubenswrapper[4681]: I0122 09:13:36.187165 4681 generic.go:334] "Generic (PLEG): container finished" podID="44b13aea-bee1-4576-87b4-d41165fcc2fa" containerID="bb2bd42fb027173e408bec2a0c8a91fa1722cf9eb2cfba54841f1f6bfa1d3082" exitCode=0 Jan 22 09:13:36 crc kubenswrapper[4681]: I0122 09:13:36.187206 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"44b13aea-bee1-4576-87b4-d41165fcc2fa","Type":"ContainerDied","Data":"bb2bd42fb027173e408bec2a0c8a91fa1722cf9eb2cfba54841f1f6bfa1d3082"} Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.098339 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.101037 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.105655 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-global-ca" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.111984 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-ca" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.112148 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.112622 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-sys-config" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.113150 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs2cb" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.115991 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.199506 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"44b13aea-bee1-4576-87b4-d41165fcc2fa","Type":"ContainerStarted","Data":"ce657f0965344a8eaebbcc185bd13024d5209e28c77f3e8e5c5ad65838ab7916"} Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.199784 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.233890 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2gbk\" (UniqueName: \"kubernetes.io/projected/9f1e3bb8-dc23-4285-8005-ea48fa984a38-kube-api-access-j2gbk\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.233968 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.234006 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.234049 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs2cb-pull\" (UniqueName: \"kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-builder-dockercfg-fs2cb-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " 
pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.234087 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9f1e3bb8-dc23-4285-8005-ea48fa984a38-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.234183 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.234214 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.234245 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.234292 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.234313 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.234551 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.234655 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs2cb-push\" (UniqueName: 
\"kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-builder-dockercfg-fs2cb-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.234756 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9f1e3bb8-dc23-4285-8005-ea48fa984a38-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.265356 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=7.934560241 podStartE2EDuration="24.265337671s" podCreationTimestamp="2026-01-22 09:13:13 +0000 UTC" firstStartedPulling="2026-01-22 09:13:13.677452999 +0000 UTC m=+584.503363504" lastFinishedPulling="2026-01-22 09:13:30.008230429 +0000 UTC m=+600.834140934" observedRunningTime="2026-01-22 09:13:37.264322503 +0000 UTC m=+608.090233008" watchObservedRunningTime="2026-01-22 09:13:37.265337671 +0000 UTC m=+608.091248176" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.335721 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs2cb-pull\" (UniqueName: \"kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-builder-dockercfg-fs2cb-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.335769 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9f1e3bb8-dc23-4285-8005-ea48fa984a38-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.335790 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.335810 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.335831 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 
09:13:37.335853 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.335874 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.335905 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.335928 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs2cb-push\" (UniqueName: \"kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-builder-dockercfg-fs2cb-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.335957 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9f1e3bb8-dc23-4285-8005-ea48fa984a38-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.335975 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2gbk\" (UniqueName: \"kubernetes.io/projected/9f1e3bb8-dc23-4285-8005-ea48fa984a38-kube-api-access-j2gbk\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.335995 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.336010 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 
09:13:37.336376 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.337204 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9f1e3bb8-dc23-4285-8005-ea48fa984a38-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.337275 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.337299 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9f1e3bb8-dc23-4285-8005-ea48fa984a38-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.337414 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.337815 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.338117 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.338142 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.338246 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.343824 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs2cb-push\" (UniqueName: \"kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-builder-dockercfg-fs2cb-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.345787 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs2cb-pull\" (UniqueName: \"kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-builder-dockercfg-fs2cb-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.355171 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.370706 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2gbk\" (UniqueName: \"kubernetes.io/projected/9f1e3bb8-dc23-4285-8005-ea48fa984a38-kube-api-access-j2gbk\") pod \"service-telemetry-framework-index-1-build\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.423099 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:13:37 crc kubenswrapper[4681]: I0122 09:13:37.707898 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Jan 22 09:13:38 crc kubenswrapper[4681]: I0122 09:13:38.205115 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"9f1e3bb8-dc23-4285-8005-ea48fa984a38","Type":"ContainerStarted","Data":"104cc0663ad65ec78983df045a4e0405947bd46108b089dd79fb42dfc76c3223"} Jan 22 09:13:39 crc kubenswrapper[4681]: I0122 09:13:39.618869 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-fvp42"] Jan 22 09:13:39 crc kubenswrapper[4681]: I0122 09:13:39.642040 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-fvp42" Jan 22 09:13:39 crc kubenswrapper[4681]: I0122 09:13:39.648427 4681 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-882tm" Jan 22 09:13:39 crc kubenswrapper[4681]: I0122 09:13:39.652050 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-fvp42"] Jan 22 09:13:39 crc kubenswrapper[4681]: I0122 09:13:39.669962 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vchs\" (UniqueName: \"kubernetes.io/projected/9d8dafc2-baca-45a7-bd2a-731f4011097e-kube-api-access-7vchs\") pod \"cert-manager-webhook-f4fb5df64-fvp42\" (UID: \"9d8dafc2-baca-45a7-bd2a-731f4011097e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-fvp42" Jan 22 09:13:39 crc kubenswrapper[4681]: I0122 09:13:39.670023 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d8dafc2-baca-45a7-bd2a-731f4011097e-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-fvp42\" (UID: \"9d8dafc2-baca-45a7-bd2a-731f4011097e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-fvp42" Jan 22 09:13:39 crc kubenswrapper[4681]: I0122 09:13:39.772635 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vchs\" (UniqueName: \"kubernetes.io/projected/9d8dafc2-baca-45a7-bd2a-731f4011097e-kube-api-access-7vchs\") pod \"cert-manager-webhook-f4fb5df64-fvp42\" (UID: \"9d8dafc2-baca-45a7-bd2a-731f4011097e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-fvp42" Jan 22 09:13:39 crc kubenswrapper[4681]: I0122 09:13:39.772694 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d8dafc2-baca-45a7-bd2a-731f4011097e-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-fvp42\" (UID: \"9d8dafc2-baca-45a7-bd2a-731f4011097e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-fvp42" Jan 22 09:13:39 crc kubenswrapper[4681]: I0122 09:13:39.812208 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d8dafc2-baca-45a7-bd2a-731f4011097e-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-fvp42\" (UID: \"9d8dafc2-baca-45a7-bd2a-731f4011097e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-fvp42" Jan 22 09:13:39 crc kubenswrapper[4681]: I0122 09:13:39.813605 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vchs\" (UniqueName: \"kubernetes.io/projected/9d8dafc2-baca-45a7-bd2a-731f4011097e-kube-api-access-7vchs\") pod \"cert-manager-webhook-f4fb5df64-fvp42\" (UID: \"9d8dafc2-baca-45a7-bd2a-731f4011097e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-fvp42" Jan 22 09:13:39 crc kubenswrapper[4681]: I0122 09:13:39.974821 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-fvp42" Jan 22 09:13:40 crc kubenswrapper[4681]: I0122 09:13:40.310326 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-fvp42"] Jan 22 09:13:40 crc kubenswrapper[4681]: W0122 09:13:40.325692 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d8dafc2_baca_45a7_bd2a_731f4011097e.slice/crio-009ed4a7d8f4fefa0386170689717e6e5245afd9d48c37e1dd9db59b633bec68 WatchSource:0}: Error finding container 009ed4a7d8f4fefa0386170689717e6e5245afd9d48c37e1dd9db59b633bec68: Status 404 returned error can't find the container with id 009ed4a7d8f4fefa0386170689717e6e5245afd9d48c37e1dd9db59b633bec68 Jan 22 09:13:41 crc kubenswrapper[4681]: I0122 09:13:41.226611 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-fvp42" event={"ID":"9d8dafc2-baca-45a7-bd2a-731f4011097e","Type":"ContainerStarted","Data":"009ed4a7d8f4fefa0386170689717e6e5245afd9d48c37e1dd9db59b633bec68"} Jan 22 09:13:46 crc kubenswrapper[4681]: I0122 09:13:46.454554 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-kfsd2"] Jan 22 09:13:46 crc kubenswrapper[4681]: I0122 09:13:46.455932 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-kfsd2" Jan 22 09:13:46 crc kubenswrapper[4681]: I0122 09:13:46.458911 4681 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-ks7f8" Jan 22 09:13:46 crc kubenswrapper[4681]: I0122 09:13:46.465229 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-kfsd2"] Jan 22 09:13:46 crc kubenswrapper[4681]: I0122 09:13:46.497050 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f293d0c8-2cee-4754-9edc-67524679619a-bound-sa-token\") pod \"cert-manager-86cb77c54b-kfsd2\" (UID: \"f293d0c8-2cee-4754-9edc-67524679619a\") " pod="cert-manager/cert-manager-86cb77c54b-kfsd2" Jan 22 09:13:46 crc kubenswrapper[4681]: I0122 09:13:46.497149 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntpp5\" (UniqueName: \"kubernetes.io/projected/f293d0c8-2cee-4754-9edc-67524679619a-kube-api-access-ntpp5\") pod \"cert-manager-86cb77c54b-kfsd2\" (UID: \"f293d0c8-2cee-4754-9edc-67524679619a\") " pod="cert-manager/cert-manager-86cb77c54b-kfsd2" Jan 22 09:13:46 crc kubenswrapper[4681]: I0122 09:13:46.598287 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntpp5\" (UniqueName: \"kubernetes.io/projected/f293d0c8-2cee-4754-9edc-67524679619a-kube-api-access-ntpp5\") pod \"cert-manager-86cb77c54b-kfsd2\" (UID: \"f293d0c8-2cee-4754-9edc-67524679619a\") " pod="cert-manager/cert-manager-86cb77c54b-kfsd2" Jan 22 09:13:46 crc kubenswrapper[4681]: I0122 09:13:46.598350 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f293d0c8-2cee-4754-9edc-67524679619a-bound-sa-token\") pod \"cert-manager-86cb77c54b-kfsd2\" (UID: \"f293d0c8-2cee-4754-9edc-67524679619a\") " pod="cert-manager/cert-manager-86cb77c54b-kfsd2" Jan 22 09:13:46 crc kubenswrapper[4681]: I0122 09:13:46.620814 4681 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f293d0c8-2cee-4754-9edc-67524679619a-bound-sa-token\") pod \"cert-manager-86cb77c54b-kfsd2\" (UID: \"f293d0c8-2cee-4754-9edc-67524679619a\") " pod="cert-manager/cert-manager-86cb77c54b-kfsd2" Jan 22 09:13:46 crc kubenswrapper[4681]: I0122 09:13:46.624552 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntpp5\" (UniqueName: \"kubernetes.io/projected/f293d0c8-2cee-4754-9edc-67524679619a-kube-api-access-ntpp5\") pod \"cert-manager-86cb77c54b-kfsd2\" (UID: \"f293d0c8-2cee-4754-9edc-67524679619a\") " pod="cert-manager/cert-manager-86cb77c54b-kfsd2" Jan 22 09:13:46 crc kubenswrapper[4681]: I0122 09:13:46.771224 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-kfsd2" Jan 22 09:13:47 crc kubenswrapper[4681]: I0122 09:13:47.068851 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-kfsd2"] Jan 22 09:13:47 crc kubenswrapper[4681]: I0122 09:13:47.279585 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-kfsd2" event={"ID":"f293d0c8-2cee-4754-9edc-67524679619a","Type":"ContainerStarted","Data":"e98976ee04ea46eb951faf854f1dd9343b070f9e0863e46b9456757ff93dbf10"} Jan 22 09:13:48 crc kubenswrapper[4681]: I0122 09:13:48.517528 4681 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="44b13aea-bee1-4576-87b4-d41165fcc2fa" containerName="elasticsearch" probeResult="failure" output=< Jan 22 09:13:48 crc kubenswrapper[4681]: {"timestamp": "2026-01-22T09:13:48+00:00", "message": "readiness probe failed", "curl_rc": "7"} Jan 22 09:13:48 crc kubenswrapper[4681]: > Jan 22 09:13:53 crc kubenswrapper[4681]: I0122 09:13:53.860286 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Jan 22 09:13:53 crc kubenswrapper[4681]: E0122 09:13:53.871912 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7a908a23111a624c3fa04dc3105a7a97f48ee60105308bbb6ed42a40d63c2fe" Jan 22 09:13:53 crc kubenswrapper[4681]: E0122 09:13:53.872644 4681 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 22 09:13:53 crc kubenswrapper[4681]: init container &Container{Name:git-clone,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7a908a23111a624c3fa04dc3105a7a97f48ee60105308bbb6ed42a40d63c2fe,Command:[],Args:[openshift-git-clone 
--v=0],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:BUILD,Value:{"kind":"Build","apiVersion":"build.openshift.io/v1","metadata":{"name":"service-telemetry-framework-index-1","namespace":"service-telemetry","uid":"abf5c907-e9cb-40d8-927b-4d2955e2936f","resourceVersion":"33474","generation":1,"creationTimestamp":"2026-01-22T09:13:37Z","labels":{"build":"service-telemetry-framework-index","buildconfig":"service-telemetry-framework-index","openshift.io/build-config.name":"service-telemetry-framework-index","openshift.io/build.start-policy":"Serial"},"annotations":{"openshift.io/build-config.name":"service-telemetry-framework-index","openshift.io/build.number":"1"},"ownerReferences":[{"apiVersion":"build.openshift.io/v1","kind":"BuildConfig","name":"service-telemetry-framework-index","uid":"05945b41-dd5e-4d68-b5ed-53db288c5093","controller":true}],"managedFields":[{"manager":"openshift-apiserver","operation":"Update","apiVersion":"build.openshift.io/v1","time":"2026-01-22T09:13:37Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:openshift.io/build-config.name":{},"f:openshift.io/build.number":{}},"f:labels":{".":{},"f:build":{},"f:buildconfig":{},"f:openshift.io/build-config.name":{},"f:openshift.io/build.start-policy":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"05945b41-dd5e-4d68-b5ed-53db288c5093\"}":{}}},"f:spec":{"f:output":{"f:to":{}},"f:revision":{".":{},"f:git":{".":{},"f:author":{".":{},"f:email":{},"f:name":{}},"f:commit":{},"f:committer":{".":{},"f:email":{},"f:name":{}},"f:message":{}},"f:type":{}},"f:serviceAccount":{},"f:source":{"f:binary":{},"f:dockerfile":{},"f:type":{}},"f:strategy":{"f:dockerStrategy":{".":{},"f:from":{},"f:volumes":{".":{},"k:{\"name\":\"pull-secret\"}":{".":{},"f:mounts":{".":{},"k:{\"destinationPath\":\"/opt/app-root/auth\"}":{".":{},"f:destinationPath":{}}},"f:name":{},"f:source":{".":{},"f:secret":{".":{},"f:defaultMode":{},"f:secretName":{}},"f:type":{}}}}},"f:type":{}}},"f:status":{"f:conditions":{".":{},"k:{\"type\":\"New\"}":{".":{},"f:lastTransitionTime":{},"f:lastUpdateTime":{},"f:status":{},"f:type":{}}},"f:config":{},"f:phase":{}}}}]},"spec":{"serviceAccount":"builder","source":{"type":"Binary","binary":{},"dockerfile":"# The base image is expected to contain\n# /bin/opm (with a serve subcommand) and /bin/grpc_health_probe\n\nFROM quay.io/openshift/origin-operator-registry:4.13\n\nCOPY --chmod=666 index.yaml /configs/\n\nRUN mkdir /tmp/auth/\n# we need the contents of the mounted build volume from secret placed into config.json\nRUN cp /opt/app-root/auth/.dockerconfigjson /tmp/auth/config.json\nRUN DOCKER_CONFIG=/tmp/auth /bin/opm --skip-tls-verify render image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-operator-bundle:nightly-head image-registry.openshift-image-registry.svc:5000/service-telemetry/smart-gateway-operator-bundle:nightly-head --output=yaml \u003e\u003e /configs/index.yaml\n\nENTRYPOINT [\"/bin/opm\"]\nCMD [\"serve\", \"/configs\"]\n# Set DC-specific label for the location of the DC root directory\n# in the image\nLABEL operators.operatorframework.io.index.configs.v1=/configs\n"},"revision":{"type":"Git","git":{"commit":"61feee2ec412990f7c9b28eaf8191038759ea504","author":{"name":"Victoria Martinez de la Cruz","email":"victoria@redhat.com"},"committer":{"name":"GitHub","email":"noreply@github.com"},"message":"Update ansible-lint pinned version (#688) (#691)\n\n* Update ansible-lint pinned version (#688)\n\n* Update ansible-lint pinned version\n\n Bump 
ansible-lint to 25.2.0\n\n* Update ansible-lint-ignore with new linting issues\n\nansible-lint 25.2.0 introduces new linting rules that\nwe are not following. Add them to the ignore list.\n\n* Pin to ansible 12.0.0\n\n(cherry picked from commit 9be283230c8041ea41367857f7d49a2bafa19bef)\n\n* Update .ansible-lint-ignore to match issues in stable-1.5"}},"strategy":{"type":"Docker","dockerStrategy":{"from":{"kind":"DockerImage","name":"quay.io/openshift/origin-operator-registry@sha256:3f7a5a6e548e23a777e21f467552523aae63a157da293498ac1921c3be8c9f8a"},"pullSecret":{"name":"builder-dockercfg-fs2cb"},"volumes":[{"name":"pull-secret","source":{"type":"Secret","secret":{"secretName":"service-telemetry-framework-index-dockercfg","defaultMode":420}},"mounts":[{"destinationPath":"/opt/app-root/auth"}]}]}},"output":{"to":{"kind":"DockerImage","name":"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest"},"pushSecret":{"name":"builder-dockercfg-fs2cb"}},"resources":{},"postCommit":{},"nodeSelector":null},"status":{"phase":"New","outputDockerImageReference":"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest","config":{"kind":"BuildConfig","namespace":"service-telemetry","name":"service-telemetry-framework-index"},"output":{},"conditions":[{"type":"New","status":"True","lastUpdateTime":"2026-01-22T09:13:36Z","lastTransitionTime":"2026-01-22T09:13:36Z"}]}} Jan 22 09:13:53 crc kubenswrapper[4681]: ,ValueFrom:nil,},EnvVar{Name:LANG,Value:C.utf8,ValueFrom:nil,},EnvVar{Name:BUILD_REGISTRIES_CONF_PATH,Value:/var/run/configs/openshift.io/build-system/registries.conf,ValueFrom:nil,},EnvVar{Name:BUILD_REGISTRIES_DIR_PATH,Value:/var/run/configs/openshift.io/build-system/registries.d,ValueFrom:nil,},EnvVar{Name:BUILD_SIGNATURE_POLICY_PATH,Value:/var/run/configs/openshift.io/build-system/policy.json,ValueFrom:nil,},EnvVar{Name:BUILD_STORAGE_CONF_PATH,Value:/var/run/configs/openshift.io/build-system/storage.conf,ValueFrom:nil,},EnvVar{Name:BUILD_BLOBCACHE_DIR,Value:/var/cache/blobs,ValueFrom:nil,},EnvVar{Name:HTTP_PROXY,Value:,ValueFrom:nil,},EnvVar{Name:http_proxy,Value:,ValueFrom:nil,},EnvVar{Name:HTTPS_PROXY,Value:,ValueFrom:nil,},EnvVar{Name:https_proxy,Value:,ValueFrom:nil,},EnvVar{Name:NO_PROXY,Value:,ValueFrom:nil,},EnvVar{Name:no_proxy,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:buildworkdir,ReadOnly:false,MountPath:/tmp/build,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:build-system-configs,ReadOnly:true,MountPath:/var/run/configs/openshift.io/build-system,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:build-ca-bundles,ReadOnly:false,MountPath:/var/run/configs/openshift.io/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:build-proxy-ca-bundles,ReadOnly:false,MountPath:/var/run/configs/openshift.io/pki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:build-blob-cache,ReadOnly:false,MountPath:/var/cache/blobs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j2gbk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/de
v/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[CHOWN DAC_OVERRIDE],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:true,StdinOnce:true,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod service-telemetry-framework-index-1-build_service-telemetry(9f1e3bb8-dc23-4285-8005-ea48fa984a38): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Jan 22 09:13:53 crc kubenswrapper[4681]: > logger="UnhandledError" Jan 22 09:13:53 crc kubenswrapper[4681]: E0122 09:13:53.873985 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"git-clone\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/service-telemetry-framework-index-1-build" podUID="9f1e3bb8-dc23-4285-8005-ea48fa984a38" Jan 22 09:13:54 crc kubenswrapper[4681]: E0122 09:13:54.338581 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"git-clone\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7a908a23111a624c3fa04dc3105a7a97f48ee60105308bbb6ed42a40d63c2fe\\\"\"" pod="service-telemetry/service-telemetry-framework-index-1-build" podUID="9f1e3bb8-dc23-4285-8005-ea48fa984a38" Jan 22 09:13:59 crc kubenswrapper[4681]: I0122 09:13:59.369661 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-kfsd2" event={"ID":"f293d0c8-2cee-4754-9edc-67524679619a","Type":"ContainerStarted","Data":"9a682f87bc875d4b4702521f50c0afbe5f24285ac0a54929ff6d5149df70c484"} Jan 22 09:13:59 crc kubenswrapper[4681]: I0122 09:13:59.373154 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-fvp42" event={"ID":"9d8dafc2-baca-45a7-bd2a-731f4011097e","Type":"ContainerStarted","Data":"49b26d811dcba645c1c33e04fbc7785a9f8f27719b0e5b5dd59be73d9b16935f"} Jan 22 09:13:59 crc kubenswrapper[4681]: I0122 09:13:59.374311 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-fvp42" Jan 22 09:13:59 crc kubenswrapper[4681]: I0122 09:13:59.377478 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-p7hrl" event={"ID":"2177d3cd-4719-4db5-b8fc-cbfa8e83ddf2","Type":"ContainerStarted","Data":"fe4b823aaa91a596f5e467420b8f5de8588fb09a1ed62c36239acea78936c641"} Jan 22 09:13:59 crc kubenswrapper[4681]: I0122 09:13:59.401095 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-kfsd2" podStartSLOduration=2.280176466 podStartE2EDuration="13.40105043s" podCreationTimestamp="2026-01-22 09:13:46 +0000 UTC" firstStartedPulling="2026-01-22 09:13:47.090608594 +0000 UTC m=+617.916519099" lastFinishedPulling="2026-01-22 09:13:58.211482558 +0000 UTC m=+629.037393063" observedRunningTime="2026-01-22 09:13:59.394588047 +0000 UTC m=+630.220498552" watchObservedRunningTime="2026-01-22 09:13:59.40105043 +0000 UTC m=+630.226960975" Jan 22 09:13:59 crc 
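The actionable failures in this stretch are the E-level entries: the canceled image pull for the git-clone init container ("copying config: context canceled"), the resulting ErrImagePull and ImagePullBackOff "Error syncing pod" messages, and, further up, the elasticsearch readiness probe failure with curl_rc 7 (curl's "failed to connect to host"). One way to surface them from a journal dump is to filter for E-prefixed kubenswrapper entries and tally the err="..." reasons. A rough sketch (not an official tool), again assuming one journal entry per line on stdin:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Surfaces error-level kubenswrapper entries (klog "E..." lines) from a journal
// dump on stdin and tallies them by a prefix of their err="..." field, so pairs
// like ErrImagePull -> ImagePullBackOff for the git-clone init container stand out.
func main() {
	errorEntry := regexp.MustCompile(`kubenswrapper\[\d+\]: E\d{4} \d{2}:\d{2}:\d{2}\.\d+`)
	errField := regexp.MustCompile(`err="((?:\\.|[^"\\]){1,160})`) // tolerate escaped quotes inside the field

	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1<<20), 1<<20)
	for sc.Scan() {
		line := sc.Text()
		if !errorEntry.MatchString(line) {
			continue
		}
		key := "(no err= field)" // e.g. the multi-line "Unhandled Error" err=< ... > dump
		if m := errField.FindStringSubmatch(line); m != nil {
			key = m[1]
		}
		counts[key]++
	}
	for reason, n := range counts {
		fmt.Printf("%3d  %s\n", n, reason)
	}
}
```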
kubenswrapper[4681]: I0122 09:13:59.422012 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-fvp42" podStartSLOduration=2.530562083 podStartE2EDuration="20.42199295s" podCreationTimestamp="2026-01-22 09:13:39 +0000 UTC" firstStartedPulling="2026-01-22 09:13:40.3416672 +0000 UTC m=+611.167577695" lastFinishedPulling="2026-01-22 09:13:58.233098057 +0000 UTC m=+629.059008562" observedRunningTime="2026-01-22 09:13:59.420753847 +0000 UTC m=+630.246664382" watchObservedRunningTime="2026-01-22 09:13:59.42199295 +0000 UTC m=+630.247903455" Jan 22 09:13:59 crc kubenswrapper[4681]: I0122 09:13:59.445308 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-p7hrl" podStartSLOduration=2.1716437219999998 podStartE2EDuration="24.445286694s" podCreationTimestamp="2026-01-22 09:13:35 +0000 UTC" firstStartedPulling="2026-01-22 09:13:35.967401808 +0000 UTC m=+606.793312313" lastFinishedPulling="2026-01-22 09:13:58.24104478 +0000 UTC m=+629.066955285" observedRunningTime="2026-01-22 09:13:59.442236693 +0000 UTC m=+630.268147198" watchObservedRunningTime="2026-01-22 09:13:59.445286694 +0000 UTC m=+630.271197209" Jan 22 09:14:04 crc kubenswrapper[4681]: I0122 09:14:04.978469 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-fvp42" Jan 22 09:14:08 crc kubenswrapper[4681]: I0122 09:14:08.459321 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"9f1e3bb8-dc23-4285-8005-ea48fa984a38","Type":"ContainerStarted","Data":"b68241aeedcd54cab16de5252e2b574b507132d614d29c34824c6a35c5f008f4"} Jan 22 09:14:08 crc kubenswrapper[4681]: E0122 09:14:08.546867 4681 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=8939313707385252646, SKID=, AKID=FE:41:6F:79:BC:CF:6E:79:91:57:B9:9C:C9:B0:48:5E:4F:8A:38:4A failed: x509: certificate signed by unknown authority" Jan 22 09:14:09 crc kubenswrapper[4681]: I0122 09:14:09.582491 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Jan 22 09:14:09 crc kubenswrapper[4681]: I0122 09:14:09.582839 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-framework-index-1-build" podUID="9f1e3bb8-dc23-4285-8005-ea48fa984a38" containerName="git-clone" containerID="cri-o://b68241aeedcd54cab16de5252e2b574b507132d614d29c34824c6a35c5f008f4" gracePeriod=30 Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.034787 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_9f1e3bb8-dc23-4285-8005-ea48fa984a38/git-clone/0.log" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.035153 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.079507 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-proxy-ca-bundles\") pod \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.080614 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "9f1e3bb8-dc23-4285-8005-ea48fa984a38" (UID: "9f1e3bb8-dc23-4285-8005-ea48fa984a38"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.081093 4681 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.181890 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-container-storage-root\") pod \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.181969 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-ca-bundles\") pod \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.182014 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.182040 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs2cb-push\" (UniqueName: \"kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-builder-dockercfg-fs2cb-push\") pod \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.182086 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-buildworkdir\") pod \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.182107 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-system-configs\") pod \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.182132 4681 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9f1e3bb8-dc23-4285-8005-ea48fa984a38-node-pullsecrets\") pod \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.182162 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs2cb-pull\" (UniqueName: \"kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-builder-dockercfg-fs2cb-pull\") pod \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.182189 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-container-storage-run\") pod \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.182334 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-blob-cache\") pod \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.182390 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2gbk\" (UniqueName: \"kubernetes.io/projected/9f1e3bb8-dc23-4285-8005-ea48fa984a38-kube-api-access-j2gbk\") pod \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.182418 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9f1e3bb8-dc23-4285-8005-ea48fa984a38-buildcachedir\") pod \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\" (UID: \"9f1e3bb8-dc23-4285-8005-ea48fa984a38\") " Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.182358 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f1e3bb8-dc23-4285-8005-ea48fa984a38-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "9f1e3bb8-dc23-4285-8005-ea48fa984a38" (UID: "9f1e3bb8-dc23-4285-8005-ea48fa984a38"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.182828 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f1e3bb8-dc23-4285-8005-ea48fa984a38-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "9f1e3bb8-dc23-4285-8005-ea48fa984a38" (UID: "9f1e3bb8-dc23-4285-8005-ea48fa984a38"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.182864 4681 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9f1e3bb8-dc23-4285-8005-ea48fa984a38-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.182966 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "9f1e3bb8-dc23-4285-8005-ea48fa984a38" (UID: "9f1e3bb8-dc23-4285-8005-ea48fa984a38"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.183032 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "9f1e3bb8-dc23-4285-8005-ea48fa984a38" (UID: "9f1e3bb8-dc23-4285-8005-ea48fa984a38"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.183059 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "9f1e3bb8-dc23-4285-8005-ea48fa984a38" (UID: "9f1e3bb8-dc23-4285-8005-ea48fa984a38"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.183598 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "9f1e3bb8-dc23-4285-8005-ea48fa984a38" (UID: "9f1e3bb8-dc23-4285-8005-ea48fa984a38"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.184794 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "9f1e3bb8-dc23-4285-8005-ea48fa984a38" (UID: "9f1e3bb8-dc23-4285-8005-ea48fa984a38"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.186030 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "9f1e3bb8-dc23-4285-8005-ea48fa984a38" (UID: "9f1e3bb8-dc23-4285-8005-ea48fa984a38"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.189647 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-builder-dockercfg-fs2cb-push" (OuterVolumeSpecName: "builder-dockercfg-fs2cb-push") pod "9f1e3bb8-dc23-4285-8005-ea48fa984a38" (UID: "9f1e3bb8-dc23-4285-8005-ea48fa984a38"). InnerVolumeSpecName "builder-dockercfg-fs2cb-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.190644 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "9f1e3bb8-dc23-4285-8005-ea48fa984a38" (UID: "9f1e3bb8-dc23-4285-8005-ea48fa984a38"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.192206 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f1e3bb8-dc23-4285-8005-ea48fa984a38-kube-api-access-j2gbk" (OuterVolumeSpecName: "kube-api-access-j2gbk") pod "9f1e3bb8-dc23-4285-8005-ea48fa984a38" (UID: "9f1e3bb8-dc23-4285-8005-ea48fa984a38"). InnerVolumeSpecName "kube-api-access-j2gbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.192385 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-builder-dockercfg-fs2cb-pull" (OuterVolumeSpecName: "builder-dockercfg-fs2cb-pull") pod "9f1e3bb8-dc23-4285-8005-ea48fa984a38" (UID: "9f1e3bb8-dc23-4285-8005-ea48fa984a38"). InnerVolumeSpecName "builder-dockercfg-fs2cb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.284184 4681 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.284239 4681 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.284285 4681 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.284310 4681 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs2cb-push\" (UniqueName: \"kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-builder-dockercfg-fs2cb-push\") on node \"crc\" DevicePath \"\"" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.284333 4681 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.284470 4681 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.284494 4681 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs2cb-pull\" (UniqueName: 
\"kubernetes.io/secret/9f1e3bb8-dc23-4285-8005-ea48fa984a38-builder-dockercfg-fs2cb-pull\") on node \"crc\" DevicePath \"\"" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.284513 4681 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.284531 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2gbk\" (UniqueName: \"kubernetes.io/projected/9f1e3bb8-dc23-4285-8005-ea48fa984a38-kube-api-access-j2gbk\") on node \"crc\" DevicePath \"\"" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.284549 4681 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f1e3bb8-dc23-4285-8005-ea48fa984a38-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.284567 4681 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9f1e3bb8-dc23-4285-8005-ea48fa984a38-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.492145 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_9f1e3bb8-dc23-4285-8005-ea48fa984a38/git-clone/0.log" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.492227 4681 generic.go:334] "Generic (PLEG): container finished" podID="9f1e3bb8-dc23-4285-8005-ea48fa984a38" containerID="b68241aeedcd54cab16de5252e2b574b507132d614d29c34824c6a35c5f008f4" exitCode=1 Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.492375 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"9f1e3bb8-dc23-4285-8005-ea48fa984a38","Type":"ContainerDied","Data":"b68241aeedcd54cab16de5252e2b574b507132d614d29c34824c6a35c5f008f4"} Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.492433 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.492468 4681 scope.go:117] "RemoveContainer" containerID="b68241aeedcd54cab16de5252e2b574b507132d614d29c34824c6a35c5f008f4" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.492447 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"9f1e3bb8-dc23-4285-8005-ea48fa984a38","Type":"ContainerDied","Data":"104cc0663ad65ec78983df045a4e0405947bd46108b089dd79fb42dfc76c3223"} Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.527835 4681 scope.go:117] "RemoveContainer" containerID="b68241aeedcd54cab16de5252e2b574b507132d614d29c34824c6a35c5f008f4" Jan 22 09:14:10 crc kubenswrapper[4681]: E0122 09:14:10.529442 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b68241aeedcd54cab16de5252e2b574b507132d614d29c34824c6a35c5f008f4\": container with ID starting with b68241aeedcd54cab16de5252e2b574b507132d614d29c34824c6a35c5f008f4 not found: ID does not exist" containerID="b68241aeedcd54cab16de5252e2b574b507132d614d29c34824c6a35c5f008f4" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.529535 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b68241aeedcd54cab16de5252e2b574b507132d614d29c34824c6a35c5f008f4"} err="failed to get container status \"b68241aeedcd54cab16de5252e2b574b507132d614d29c34824c6a35c5f008f4\": rpc error: code = NotFound desc = could not find container \"b68241aeedcd54cab16de5252e2b574b507132d614d29c34824c6a35c5f008f4\": container with ID starting with b68241aeedcd54cab16de5252e2b574b507132d614d29c34824c6a35c5f008f4 not found: ID does not exist" Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.548613 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Jan 22 09:14:10 crc kubenswrapper[4681]: I0122 09:14:10.557382 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Jan 22 09:14:11 crc kubenswrapper[4681]: I0122 09:14:11.465954 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f1e3bb8-dc23-4285-8005-ea48fa984a38" path="/var/lib/kubelet/pods/9f1e3bb8-dc23-4285-8005-ea48fa984a38/volumes" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.063206 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-2-build"] Jan 22 09:14:21 crc kubenswrapper[4681]: E0122 09:14:21.063997 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1e3bb8-dc23-4285-8005-ea48fa984a38" containerName="git-clone" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.064012 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1e3bb8-dc23-4285-8005-ea48fa984a38" containerName="git-clone" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.064154 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f1e3bb8-dc23-4285-8005-ea48fa984a38" containerName="git-clone" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.065033 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.067639 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-2-global-ca" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.068015 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-fs2cb" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.068407 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-2-sys-config" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.080355 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.080555 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-2-ca" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.100363 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-2-build"] Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.144776 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jbkc\" (UniqueName: \"kubernetes.io/projected/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-kube-api-access-5jbkc\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.144836 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-node-pullsecrets\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.144965 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs2cb-pull\" (UniqueName: \"kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-builder-dockercfg-fs2cb-pull\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.145083 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-blob-cache\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.145113 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " 
pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.145150 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-container-storage-root\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.145214 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.145366 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-fs2cb-push\" (UniqueName: \"kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-builder-dockercfg-fs2cb-push\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.145413 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-system-configs\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.145475 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-ca-bundles\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.145544 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-buildcachedir\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.145573 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-container-storage-run\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.145598 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-buildworkdir\") pod 
\"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.246731 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-blob-cache\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.246809 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.246868 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-container-storage-root\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.246905 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.246947 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs2cb-push\" (UniqueName: \"kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-builder-dockercfg-fs2cb-push\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.246979 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-system-configs\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.247058 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-ca-bundles\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.247109 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-buildcachedir\") pod 
\"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.247150 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-buildworkdir\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.247183 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-container-storage-run\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.247225 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jbkc\" (UniqueName: \"kubernetes.io/projected/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-kube-api-access-5jbkc\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.247314 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-node-pullsecrets\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.247362 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-fs2cb-pull\" (UniqueName: \"kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-builder-dockercfg-fs2cb-pull\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.247600 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-blob-cache\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.247670 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-buildworkdir\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.247746 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-buildcachedir\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " 
pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.248316 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-container-storage-run\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.248336 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-node-pullsecrets\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.248382 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.248798 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-container-storage-root\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.249041 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-ca-bundles\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.249503 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-system-configs\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.257410 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.257636 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs2cb-push\" (UniqueName: \"kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-builder-dockercfg-fs2cb-push\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " 
pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.258567 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-fs2cb-pull\" (UniqueName: \"kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-builder-dockercfg-fs2cb-pull\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.280942 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jbkc\" (UniqueName: \"kubernetes.io/projected/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-kube-api-access-5jbkc\") pod \"service-telemetry-framework-index-2-build\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.396690 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:14:21 crc kubenswrapper[4681]: I0122 09:14:21.926728 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-2-build"] Jan 22 09:14:22 crc kubenswrapper[4681]: I0122 09:14:22.600458 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-2-build" event={"ID":"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6","Type":"ContainerStarted","Data":"6dd766df65bd6732abe7335548cc35888e3473f8c920807b3d902baabc395084"} Jan 22 09:14:23 crc kubenswrapper[4681]: I0122 09:14:23.105359 4681 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 22 09:14:23 crc kubenswrapper[4681]: I0122 09:14:23.609147 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-2-build" event={"ID":"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6","Type":"ContainerStarted","Data":"0b34143011c062eaa33d51a58939b3d639d9fe2fdba2b7dcdc393bab47985e3c"} Jan 22 09:14:24 crc kubenswrapper[4681]: I0122 09:14:24.617010 4681 generic.go:334] "Generic (PLEG): container finished" podID="1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" containerID="0b34143011c062eaa33d51a58939b3d639d9fe2fdba2b7dcdc393bab47985e3c" exitCode=0 Jan 22 09:14:24 crc kubenswrapper[4681]: I0122 09:14:24.617047 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-2-build" event={"ID":"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6","Type":"ContainerDied","Data":"0b34143011c062eaa33d51a58939b3d639d9fe2fdba2b7dcdc393bab47985e3c"} Jan 22 09:14:25 crc kubenswrapper[4681]: I0122 09:14:25.625847 4681 generic.go:334] "Generic (PLEG): container finished" podID="1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" containerID="8b60eee54e1fac03770f732bd7652c2146d2d380f74c9444cc0d226b48d19ffd" exitCode=0 Jan 22 09:14:25 crc kubenswrapper[4681]: I0122 09:14:25.626068 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-2-build" event={"ID":"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6","Type":"ContainerDied","Data":"8b60eee54e1fac03770f732bd7652c2146d2d380f74c9444cc0d226b48d19ffd"} Jan 22 09:14:25 crc kubenswrapper[4681]: I0122 09:14:25.698221 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_service-telemetry-framework-index-2-build_1e6f6f66-599e-48a5-b4b3-e9eaf71500e6/manage-dockerfile/0.log" Jan 22 09:14:26 crc kubenswrapper[4681]: I0122 09:14:26.642002 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-2-build" event={"ID":"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6","Type":"ContainerStarted","Data":"8ce9e06efabcecaafcb8e9f0713de4aecc6d0b9837c9ce8faf197a7b783e3b16"} Jan 22 09:14:26 crc kubenswrapper[4681]: I0122 09:14:26.692549 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-index-2-build" podStartSLOduration=6.692523659 podStartE2EDuration="6.692523659s" podCreationTimestamp="2026-01-22 09:14:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:14:26.682502383 +0000 UTC m=+657.508412938" watchObservedRunningTime="2026-01-22 09:14:26.692523659 +0000 UTC m=+657.518434194" Jan 22 09:14:30 crc kubenswrapper[4681]: I0122 09:14:30.106150 4681 scope.go:117] "RemoveContainer" containerID="e21c61baaab33aeacadf349c125a6f3f3d923c5ac12d42ee6b26b6e9cdc7b2aa" Jan 22 09:14:30 crc kubenswrapper[4681]: I0122 09:14:30.121976 4681 scope.go:117] "RemoveContainer" containerID="c2e7f95f66ccd1ee1a7321b3e6c94ba1337f60ef235958530fac73db2b9815b9" Jan 22 09:14:30 crc kubenswrapper[4681]: I0122 09:14:30.143044 4681 scope.go:117] "RemoveContainer" containerID="8626f50a80c0476f044b7bc5f0b4744af38b9d914f1c0f072024d8aaa282eee6" Jan 22 09:14:56 crc kubenswrapper[4681]: I0122 09:14:56.031058 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:14:56 crc kubenswrapper[4681]: I0122 09:14:56.033898 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.161346 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66"] Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.162250 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.164525 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.165048 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.173808 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66"] Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.235614 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl8sl\" (UniqueName: \"kubernetes.io/projected/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-kube-api-access-kl8sl\") pod \"collect-profiles-29484555-wfh66\" (UID: \"ed18b43b-6da8-420f-86b0-2a3ec92f60ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.235705 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-secret-volume\") pod \"collect-profiles-29484555-wfh66\" (UID: \"ed18b43b-6da8-420f-86b0-2a3ec92f60ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.235758 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-config-volume\") pod \"collect-profiles-29484555-wfh66\" (UID: \"ed18b43b-6da8-420f-86b0-2a3ec92f60ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.337321 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl8sl\" (UniqueName: \"kubernetes.io/projected/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-kube-api-access-kl8sl\") pod \"collect-profiles-29484555-wfh66\" (UID: \"ed18b43b-6da8-420f-86b0-2a3ec92f60ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.337690 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-secret-volume\") pod \"collect-profiles-29484555-wfh66\" (UID: \"ed18b43b-6da8-420f-86b0-2a3ec92f60ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.337838 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-config-volume\") pod \"collect-profiles-29484555-wfh66\" (UID: \"ed18b43b-6da8-420f-86b0-2a3ec92f60ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.339029 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-config-volume\") pod 
\"collect-profiles-29484555-wfh66\" (UID: \"ed18b43b-6da8-420f-86b0-2a3ec92f60ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.344512 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-secret-volume\") pod \"collect-profiles-29484555-wfh66\" (UID: \"ed18b43b-6da8-420f-86b0-2a3ec92f60ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.366752 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl8sl\" (UniqueName: \"kubernetes.io/projected/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-kube-api-access-kl8sl\") pod \"collect-profiles-29484555-wfh66\" (UID: \"ed18b43b-6da8-420f-86b0-2a3ec92f60ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.487735 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.765043 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66"] Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.913543 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" event={"ID":"ed18b43b-6da8-420f-86b0-2a3ec92f60ab","Type":"ContainerStarted","Data":"bb4c3e0d3b68af662f1157338e9ba37923156bac785060429df0ad07d300114a"} Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.913602 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" event={"ID":"ed18b43b-6da8-420f-86b0-2a3ec92f60ab","Type":"ContainerStarted","Data":"f9633fb80e7591c8aaea1d96e57cf8a789f6a865e8e58229c7703b7a5f7d56a0"} Jan 22 09:15:00 crc kubenswrapper[4681]: I0122 09:15:00.935890 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" podStartSLOduration=0.935869086 podStartE2EDuration="935.869086ms" podCreationTimestamp="2026-01-22 09:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:15:00.933896484 +0000 UTC m=+691.759807059" watchObservedRunningTime="2026-01-22 09:15:00.935869086 +0000 UTC m=+691.761779621" Jan 22 09:15:01 crc kubenswrapper[4681]: I0122 09:15:01.920190 4681 generic.go:334] "Generic (PLEG): container finished" podID="ed18b43b-6da8-420f-86b0-2a3ec92f60ab" containerID="bb4c3e0d3b68af662f1157338e9ba37923156bac785060429df0ad07d300114a" exitCode=0 Jan 22 09:15:01 crc kubenswrapper[4681]: I0122 09:15:01.920253 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" event={"ID":"ed18b43b-6da8-420f-86b0-2a3ec92f60ab","Type":"ContainerDied","Data":"bb4c3e0d3b68af662f1157338e9ba37923156bac785060429df0ad07d300114a"} Jan 22 09:15:03 crc kubenswrapper[4681]: I0122 09:15:03.253285 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" Jan 22 09:15:03 crc kubenswrapper[4681]: I0122 09:15:03.383102 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-secret-volume\") pod \"ed18b43b-6da8-420f-86b0-2a3ec92f60ab\" (UID: \"ed18b43b-6da8-420f-86b0-2a3ec92f60ab\") " Jan 22 09:15:03 crc kubenswrapper[4681]: I0122 09:15:03.383151 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl8sl\" (UniqueName: \"kubernetes.io/projected/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-kube-api-access-kl8sl\") pod \"ed18b43b-6da8-420f-86b0-2a3ec92f60ab\" (UID: \"ed18b43b-6da8-420f-86b0-2a3ec92f60ab\") " Jan 22 09:15:03 crc kubenswrapper[4681]: I0122 09:15:03.383195 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-config-volume\") pod \"ed18b43b-6da8-420f-86b0-2a3ec92f60ab\" (UID: \"ed18b43b-6da8-420f-86b0-2a3ec92f60ab\") " Jan 22 09:15:03 crc kubenswrapper[4681]: I0122 09:15:03.384203 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-config-volume" (OuterVolumeSpecName: "config-volume") pod "ed18b43b-6da8-420f-86b0-2a3ec92f60ab" (UID: "ed18b43b-6da8-420f-86b0-2a3ec92f60ab"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:15:03 crc kubenswrapper[4681]: I0122 09:15:03.389466 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-kube-api-access-kl8sl" (OuterVolumeSpecName: "kube-api-access-kl8sl") pod "ed18b43b-6da8-420f-86b0-2a3ec92f60ab" (UID: "ed18b43b-6da8-420f-86b0-2a3ec92f60ab"). InnerVolumeSpecName "kube-api-access-kl8sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:15:03 crc kubenswrapper[4681]: I0122 09:15:03.389767 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ed18b43b-6da8-420f-86b0-2a3ec92f60ab" (UID: "ed18b43b-6da8-420f-86b0-2a3ec92f60ab"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:15:03 crc kubenswrapper[4681]: I0122 09:15:03.485248 4681 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:03 crc kubenswrapper[4681]: I0122 09:15:03.485341 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl8sl\" (UniqueName: \"kubernetes.io/projected/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-kube-api-access-kl8sl\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:03 crc kubenswrapper[4681]: I0122 09:15:03.485368 4681 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed18b43b-6da8-420f-86b0-2a3ec92f60ab-config-volume\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:03 crc kubenswrapper[4681]: I0122 09:15:03.941468 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" event={"ID":"ed18b43b-6da8-420f-86b0-2a3ec92f60ab","Type":"ContainerDied","Data":"f9633fb80e7591c8aaea1d96e57cf8a789f6a865e8e58229c7703b7a5f7d56a0"} Jan 22 09:15:03 crc kubenswrapper[4681]: I0122 09:15:03.941525 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9633fb80e7591c8aaea1d96e57cf8a789f6a865e8e58229c7703b7a5f7d56a0" Jan 22 09:15:03 crc kubenswrapper[4681]: I0122 09:15:03.941600 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66" Jan 22 09:15:10 crc kubenswrapper[4681]: I0122 09:15:10.998657 4681 generic.go:334] "Generic (PLEG): container finished" podID="1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" containerID="8ce9e06efabcecaafcb8e9f0713de4aecc6d0b9837c9ce8faf197a7b783e3b16" exitCode=0 Jan 22 09:15:11 crc kubenswrapper[4681]: I0122 09:15:10.998729 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-2-build" event={"ID":"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6","Type":"ContainerDied","Data":"8ce9e06efabcecaafcb8e9f0713de4aecc6d0b9837c9ce8faf197a7b783e3b16"} Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.321424 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.421731 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs2cb-push\" (UniqueName: \"kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-builder-dockercfg-fs2cb-push\") pod \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.422179 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-buildcachedir\") pod \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.422374 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" (UID: "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6"). 
InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.422430 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.422562 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-container-storage-run\") pod \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.422628 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-node-pullsecrets\") pod \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.422727 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jbkc\" (UniqueName: \"kubernetes.io/projected/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-kube-api-access-5jbkc\") pod \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.422777 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-buildworkdir\") pod \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.422831 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-container-storage-root\") pod \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.422876 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-blob-cache\") pod \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.422940 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-fs2cb-pull\" (UniqueName: \"kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-builder-dockercfg-fs2cb-pull\") pod \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.423001 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-system-configs\") pod \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.423064 4681 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-proxy-ca-bundles\") pod \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.423115 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-ca-bundles\") pod \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\" (UID: \"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6\") " Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.423529 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" (UID: "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.423601 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" (UID: "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.424747 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" (UID: "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.425082 4681 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.425110 4681 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.425399 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" (UID: "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.425123 4681 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.425477 4681 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.425807 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" (UID: "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.426045 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" (UID: "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.428413 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-kube-api-access-5jbkc" (OuterVolumeSpecName: "kube-api-access-5jbkc") pod "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" (UID: "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6"). InnerVolumeSpecName "kube-api-access-5jbkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.430469 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-builder-dockercfg-fs2cb-pull" (OuterVolumeSpecName: "builder-dockercfg-fs2cb-pull") pod "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" (UID: "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6"). InnerVolumeSpecName "builder-dockercfg-fs2cb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.431654 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" (UID: "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.439475 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-builder-dockercfg-fs2cb-push" (OuterVolumeSpecName: "builder-dockercfg-fs2cb-push") pod "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" (UID: "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6"). InnerVolumeSpecName "builder-dockercfg-fs2cb-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.527403 4681 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.527463 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jbkc\" (UniqueName: \"kubernetes.io/projected/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-kube-api-access-5jbkc\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.527491 4681 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs2cb-pull\" (UniqueName: \"kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-builder-dockercfg-fs2cb-pull\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.527516 4681 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.527538 4681 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.527562 4681 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.527583 4681 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-fs2cb-push\" (UniqueName: \"kubernetes.io/secret/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-builder-dockercfg-fs2cb-push\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.647581 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" (UID: "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:15:12 crc kubenswrapper[4681]: I0122 09:15:12.730192 4681 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.020064 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-2-build" event={"ID":"1e6f6f66-599e-48a5-b4b3-e9eaf71500e6","Type":"ContainerDied","Data":"6dd766df65bd6732abe7335548cc35888e3473f8c920807b3d902baabc395084"} Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.020100 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dd766df65bd6732abe7335548cc35888e3473f8c920807b3d902baabc395084" Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.020252 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-2-build" Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.285830 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-operators-gcj6w"] Jan 22 09:15:13 crc kubenswrapper[4681]: E0122 09:15:13.286168 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" containerName="docker-build" Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.286189 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" containerName="docker-build" Jan 22 09:15:13 crc kubenswrapper[4681]: E0122 09:15:13.286224 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" containerName="git-clone" Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.286238 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" containerName="git-clone" Jan 22 09:15:13 crc kubenswrapper[4681]: E0122 09:15:13.286256 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed18b43b-6da8-420f-86b0-2a3ec92f60ab" containerName="collect-profiles" Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.286297 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed18b43b-6da8-420f-86b0-2a3ec92f60ab" containerName="collect-profiles" Jan 22 09:15:13 crc kubenswrapper[4681]: E0122 09:15:13.286313 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" containerName="manage-dockerfile" Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.286325 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" containerName="manage-dockerfile" Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.286509 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" containerName="docker-build" Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.286534 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed18b43b-6da8-420f-86b0-2a3ec92f60ab" containerName="collect-profiles" Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.287160 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-gcj6w" Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.291071 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-operators-dockercfg-l7c6x" Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.311310 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-gcj6w"] Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.340227 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pgcg\" (UniqueName: \"kubernetes.io/projected/71b3fa7d-31ad-4d9d-874a-6174b84238da-kube-api-access-8pgcg\") pod \"service-telemetry-framework-operators-gcj6w\" (UID: \"71b3fa7d-31ad-4d9d-874a-6174b84238da\") " pod="service-telemetry/service-telemetry-framework-operators-gcj6w" Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.442196 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pgcg\" (UniqueName: \"kubernetes.io/projected/71b3fa7d-31ad-4d9d-874a-6174b84238da-kube-api-access-8pgcg\") pod \"service-telemetry-framework-operators-gcj6w\" (UID: \"71b3fa7d-31ad-4d9d-874a-6174b84238da\") " pod="service-telemetry/service-telemetry-framework-operators-gcj6w" Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.480347 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pgcg\" (UniqueName: \"kubernetes.io/projected/71b3fa7d-31ad-4d9d-874a-6174b84238da-kube-api-access-8pgcg\") pod \"service-telemetry-framework-operators-gcj6w\" (UID: \"71b3fa7d-31ad-4d9d-874a-6174b84238da\") " pod="service-telemetry/service-telemetry-framework-operators-gcj6w" Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.611019 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-gcj6w" Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.896590 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-gcj6w"] Jan 22 09:15:13 crc kubenswrapper[4681]: W0122 09:15:13.905293 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71b3fa7d_31ad_4d9d_874a_6174b84238da.slice/crio-e485d4140f7c8b3185a8726555fc2e7c26f5b9517ffaca04b7c9645dcb2e26b0 WatchSource:0}: Error finding container e485d4140f7c8b3185a8726555fc2e7c26f5b9517ffaca04b7c9645dcb2e26b0: Status 404 returned error can't find the container with id e485d4140f7c8b3185a8726555fc2e7c26f5b9517ffaca04b7c9645dcb2e26b0 Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.943367 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6" (UID: "1e6f6f66-599e-48a5-b4b3-e9eaf71500e6"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:15:13 crc kubenswrapper[4681]: I0122 09:15:13.952161 4681 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f66-599e-48a5-b4b3-e9eaf71500e6-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:14 crc kubenswrapper[4681]: I0122 09:15:14.028742 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-gcj6w" event={"ID":"71b3fa7d-31ad-4d9d-874a-6174b84238da","Type":"ContainerStarted","Data":"e485d4140f7c8b3185a8726555fc2e7c26f5b9517ffaca04b7c9645dcb2e26b0"} Jan 22 09:15:17 crc kubenswrapper[4681]: I0122 09:15:17.651473 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-gcj6w"] Jan 22 09:15:18 crc kubenswrapper[4681]: I0122 09:15:18.470475 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-operators-b9xkl"] Jan 22 09:15:18 crc kubenswrapper[4681]: I0122 09:15:18.471990 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-b9xkl" Jan 22 09:15:18 crc kubenswrapper[4681]: I0122 09:15:18.477574 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-b9xkl"] Jan 22 09:15:18 crc kubenswrapper[4681]: I0122 09:15:18.515841 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkxql\" (UniqueName: \"kubernetes.io/projected/c1271182-7432-477c-9e01-0b0b6d850377-kube-api-access-mkxql\") pod \"service-telemetry-framework-operators-b9xkl\" (UID: \"c1271182-7432-477c-9e01-0b0b6d850377\") " pod="service-telemetry/service-telemetry-framework-operators-b9xkl" Jan 22 09:15:18 crc kubenswrapper[4681]: I0122 09:15:18.616347 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkxql\" (UniqueName: \"kubernetes.io/projected/c1271182-7432-477c-9e01-0b0b6d850377-kube-api-access-mkxql\") pod \"service-telemetry-framework-operators-b9xkl\" (UID: \"c1271182-7432-477c-9e01-0b0b6d850377\") " pod="service-telemetry/service-telemetry-framework-operators-b9xkl" Jan 22 09:15:18 crc kubenswrapper[4681]: I0122 09:15:18.641539 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkxql\" (UniqueName: \"kubernetes.io/projected/c1271182-7432-477c-9e01-0b0b6d850377-kube-api-access-mkxql\") pod \"service-telemetry-framework-operators-b9xkl\" (UID: \"c1271182-7432-477c-9e01-0b0b6d850377\") " pod="service-telemetry/service-telemetry-framework-operators-b9xkl" Jan 22 09:15:18 crc kubenswrapper[4681]: I0122 09:15:18.837906 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-b9xkl" Jan 22 09:15:25 crc kubenswrapper[4681]: I0122 09:15:25.986940 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-b9xkl"] Jan 22 09:15:25 crc kubenswrapper[4681]: W0122 09:15:25.994901 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1271182_7432_477c_9e01_0b0b6d850377.slice/crio-469a5c3ac3822c462add43e19a787659529b240b25acdb6e6ed08e89c52e7c2a WatchSource:0}: Error finding container 469a5c3ac3822c462add43e19a787659529b240b25acdb6e6ed08e89c52e7c2a: Status 404 returned error can't find the container with id 469a5c3ac3822c462add43e19a787659529b240b25acdb6e6ed08e89c52e7c2a Jan 22 09:15:26 crc kubenswrapper[4681]: I0122 09:15:26.030698 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:15:26 crc kubenswrapper[4681]: I0122 09:15:26.030766 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:15:26 crc kubenswrapper[4681]: I0122 09:15:26.115800 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-gcj6w" event={"ID":"71b3fa7d-31ad-4d9d-874a-6174b84238da","Type":"ContainerStarted","Data":"222a5b806d39dd713b2bb00401508b51cfac353716ded866513c81a8a46ddb21"} Jan 22 09:15:26 crc kubenswrapper[4681]: I0122 09:15:26.115963 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-framework-operators-gcj6w" podUID="71b3fa7d-31ad-4d9d-874a-6174b84238da" containerName="registry-server" containerID="cri-o://222a5b806d39dd713b2bb00401508b51cfac353716ded866513c81a8a46ddb21" gracePeriod=2 Jan 22 09:15:26 crc kubenswrapper[4681]: I0122 09:15:26.117543 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-b9xkl" event={"ID":"c1271182-7432-477c-9e01-0b0b6d850377","Type":"ContainerStarted","Data":"469a5c3ac3822c462add43e19a787659529b240b25acdb6e6ed08e89c52e7c2a"} Jan 22 09:15:26 crc kubenswrapper[4681]: I0122 09:15:26.137045 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-operators-gcj6w" podStartSLOduration=1.247598056 podStartE2EDuration="13.137015539s" podCreationTimestamp="2026-01-22 09:15:13 +0000 UTC" firstStartedPulling="2026-01-22 09:15:13.90777361 +0000 UTC m=+704.733684135" lastFinishedPulling="2026-01-22 09:15:25.797191103 +0000 UTC m=+716.623101618" observedRunningTime="2026-01-22 09:15:26.130985189 +0000 UTC m=+716.956895704" watchObservedRunningTime="2026-01-22 09:15:26.137015539 +0000 UTC m=+716.962926074" Jan 22 09:15:26 crc kubenswrapper[4681]: I0122 09:15:26.616533 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-gcj6w" Jan 22 09:15:26 crc kubenswrapper[4681]: I0122 09:15:26.731417 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pgcg\" (UniqueName: \"kubernetes.io/projected/71b3fa7d-31ad-4d9d-874a-6174b84238da-kube-api-access-8pgcg\") pod \"71b3fa7d-31ad-4d9d-874a-6174b84238da\" (UID: \"71b3fa7d-31ad-4d9d-874a-6174b84238da\") " Jan 22 09:15:26 crc kubenswrapper[4681]: I0122 09:15:26.739015 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b3fa7d-31ad-4d9d-874a-6174b84238da-kube-api-access-8pgcg" (OuterVolumeSpecName: "kube-api-access-8pgcg") pod "71b3fa7d-31ad-4d9d-874a-6174b84238da" (UID: "71b3fa7d-31ad-4d9d-874a-6174b84238da"). InnerVolumeSpecName "kube-api-access-8pgcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:15:26 crc kubenswrapper[4681]: I0122 09:15:26.833622 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pgcg\" (UniqueName: \"kubernetes.io/projected/71b3fa7d-31ad-4d9d-874a-6174b84238da-kube-api-access-8pgcg\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:27 crc kubenswrapper[4681]: I0122 09:15:27.128183 4681 generic.go:334] "Generic (PLEG): container finished" podID="71b3fa7d-31ad-4d9d-874a-6174b84238da" containerID="222a5b806d39dd713b2bb00401508b51cfac353716ded866513c81a8a46ddb21" exitCode=0 Jan 22 09:15:27 crc kubenswrapper[4681]: I0122 09:15:27.128340 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-gcj6w" Jan 22 09:15:27 crc kubenswrapper[4681]: I0122 09:15:27.128381 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-gcj6w" event={"ID":"71b3fa7d-31ad-4d9d-874a-6174b84238da","Type":"ContainerDied","Data":"222a5b806d39dd713b2bb00401508b51cfac353716ded866513c81a8a46ddb21"} Jan 22 09:15:27 crc kubenswrapper[4681]: I0122 09:15:27.128449 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-gcj6w" event={"ID":"71b3fa7d-31ad-4d9d-874a-6174b84238da","Type":"ContainerDied","Data":"e485d4140f7c8b3185a8726555fc2e7c26f5b9517ffaca04b7c9645dcb2e26b0"} Jan 22 09:15:27 crc kubenswrapper[4681]: I0122 09:15:27.128479 4681 scope.go:117] "RemoveContainer" containerID="222a5b806d39dd713b2bb00401508b51cfac353716ded866513c81a8a46ddb21" Jan 22 09:15:27 crc kubenswrapper[4681]: I0122 09:15:27.132138 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-b9xkl" event={"ID":"c1271182-7432-477c-9e01-0b0b6d850377","Type":"ContainerStarted","Data":"54b2abd1de4f35251b570c8324ed962b3306136fa6a1c97492283502fabd5e6a"} Jan 22 09:15:27 crc kubenswrapper[4681]: I0122 09:15:27.154532 4681 scope.go:117] "RemoveContainer" containerID="222a5b806d39dd713b2bb00401508b51cfac353716ded866513c81a8a46ddb21" Jan 22 09:15:27 crc kubenswrapper[4681]: E0122 09:15:27.155500 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"222a5b806d39dd713b2bb00401508b51cfac353716ded866513c81a8a46ddb21\": container with ID starting with 222a5b806d39dd713b2bb00401508b51cfac353716ded866513c81a8a46ddb21 not found: ID does not exist" containerID="222a5b806d39dd713b2bb00401508b51cfac353716ded866513c81a8a46ddb21" Jan 22 09:15:27 crc kubenswrapper[4681]: I0122 
09:15:27.155578 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"222a5b806d39dd713b2bb00401508b51cfac353716ded866513c81a8a46ddb21"} err="failed to get container status \"222a5b806d39dd713b2bb00401508b51cfac353716ded866513c81a8a46ddb21\": rpc error: code = NotFound desc = could not find container \"222a5b806d39dd713b2bb00401508b51cfac353716ded866513c81a8a46ddb21\": container with ID starting with 222a5b806d39dd713b2bb00401508b51cfac353716ded866513c81a8a46ddb21 not found: ID does not exist" Jan 22 09:15:27 crc kubenswrapper[4681]: I0122 09:15:27.170495 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-operators-b9xkl" podStartSLOduration=9.028687321 podStartE2EDuration="9.170464293s" podCreationTimestamp="2026-01-22 09:15:18 +0000 UTC" firstStartedPulling="2026-01-22 09:15:26.003563188 +0000 UTC m=+716.829473733" lastFinishedPulling="2026-01-22 09:15:26.14534017 +0000 UTC m=+716.971250705" observedRunningTime="2026-01-22 09:15:27.157643913 +0000 UTC m=+717.983554508" watchObservedRunningTime="2026-01-22 09:15:27.170464293 +0000 UTC m=+717.996374828" Jan 22 09:15:27 crc kubenswrapper[4681]: I0122 09:15:27.193974 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-gcj6w"] Jan 22 09:15:27 crc kubenswrapper[4681]: I0122 09:15:27.201994 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-gcj6w"] Jan 22 09:15:27 crc kubenswrapper[4681]: I0122 09:15:27.465415 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71b3fa7d-31ad-4d9d-874a-6174b84238da" path="/var/lib/kubelet/pods/71b3fa7d-31ad-4d9d-874a-6174b84238da/volumes" Jan 22 09:15:28 crc kubenswrapper[4681]: I0122 09:15:28.838350 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/service-telemetry-framework-operators-b9xkl" Jan 22 09:15:28 crc kubenswrapper[4681]: I0122 09:15:28.838802 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/service-telemetry-framework-operators-b9xkl" Jan 22 09:15:28 crc kubenswrapper[4681]: I0122 09:15:28.878442 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/service-telemetry-framework-operators-b9xkl" Jan 22 09:15:38 crc kubenswrapper[4681]: I0122 09:15:38.868753 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/service-telemetry-framework-operators-b9xkl" Jan 22 09:15:50 crc kubenswrapper[4681]: I0122 09:15:50.861643 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j"] Jan 22 09:15:50 crc kubenswrapper[4681]: E0122 09:15:50.862562 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b3fa7d-31ad-4d9d-874a-6174b84238da" containerName="registry-server" Jan 22 09:15:50 crc kubenswrapper[4681]: I0122 09:15:50.862581 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b3fa7d-31ad-4d9d-874a-6174b84238da" containerName="registry-server" Jan 22 09:15:50 crc kubenswrapper[4681]: I0122 09:15:50.862752 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b3fa7d-31ad-4d9d-874a-6174b84238da" containerName="registry-server" Jan 22 09:15:50 crc kubenswrapper[4681]: I0122 09:15:50.864054 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" Jan 22 09:15:50 crc kubenswrapper[4681]: I0122 09:15:50.873376 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j"] Jan 22 09:15:50 crc kubenswrapper[4681]: I0122 09:15:50.939473 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/910fbf37-770d-44c7-812d-805b7097b592-bundle\") pod \"cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j\" (UID: \"910fbf37-770d-44c7-812d-805b7097b592\") " pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" Jan 22 09:15:50 crc kubenswrapper[4681]: I0122 09:15:50.939552 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/910fbf37-770d-44c7-812d-805b7097b592-util\") pod \"cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j\" (UID: \"910fbf37-770d-44c7-812d-805b7097b592\") " pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" Jan 22 09:15:50 crc kubenswrapper[4681]: I0122 09:15:50.939593 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7gqj\" (UniqueName: \"kubernetes.io/projected/910fbf37-770d-44c7-812d-805b7097b592-kube-api-access-m7gqj\") pod \"cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j\" (UID: \"910fbf37-770d-44c7-812d-805b7097b592\") " pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.040885 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/910fbf37-770d-44c7-812d-805b7097b592-bundle\") pod \"cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j\" (UID: \"910fbf37-770d-44c7-812d-805b7097b592\") " pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.040998 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/910fbf37-770d-44c7-812d-805b7097b592-util\") pod \"cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j\" (UID: \"910fbf37-770d-44c7-812d-805b7097b592\") " pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.041061 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7gqj\" (UniqueName: \"kubernetes.io/projected/910fbf37-770d-44c7-812d-805b7097b592-kube-api-access-m7gqj\") pod \"cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j\" (UID: \"910fbf37-770d-44c7-812d-805b7097b592\") " pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.041561 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/910fbf37-770d-44c7-812d-805b7097b592-util\") pod \"cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j\" (UID: \"910fbf37-770d-44c7-812d-805b7097b592\") " 
pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.041660 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/910fbf37-770d-44c7-812d-805b7097b592-bundle\") pod \"cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j\" (UID: \"910fbf37-770d-44c7-812d-805b7097b592\") " pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.070621 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7gqj\" (UniqueName: \"kubernetes.io/projected/910fbf37-770d-44c7-812d-805b7097b592-kube-api-access-m7gqj\") pod \"cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j\" (UID: \"910fbf37-770d-44c7-812d-805b7097b592\") " pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.180131 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.433239 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j"] Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.652386 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm"] Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.654073 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.659878 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.666037 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm"] Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.753245 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4khg\" (UniqueName: \"kubernetes.io/projected/11634169-0b47-4aa5-90d3-6038111be8f6-kube-api-access-s4khg\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm\" (UID: \"11634169-0b47-4aa5-90d3-6038111be8f6\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.753334 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11634169-0b47-4aa5-90d3-6038111be8f6-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm\" (UID: \"11634169-0b47-4aa5-90d3-6038111be8f6\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.753448 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11634169-0b47-4aa5-90d3-6038111be8f6-bundle\") pod 
\"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm\" (UID: \"11634169-0b47-4aa5-90d3-6038111be8f6\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.854785 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4khg\" (UniqueName: \"kubernetes.io/projected/11634169-0b47-4aa5-90d3-6038111be8f6-kube-api-access-s4khg\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm\" (UID: \"11634169-0b47-4aa5-90d3-6038111be8f6\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.854855 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11634169-0b47-4aa5-90d3-6038111be8f6-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm\" (UID: \"11634169-0b47-4aa5-90d3-6038111be8f6\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.854899 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11634169-0b47-4aa5-90d3-6038111be8f6-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm\" (UID: \"11634169-0b47-4aa5-90d3-6038111be8f6\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.855731 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11634169-0b47-4aa5-90d3-6038111be8f6-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm\" (UID: \"11634169-0b47-4aa5-90d3-6038111be8f6\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.855744 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11634169-0b47-4aa5-90d3-6038111be8f6-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm\" (UID: \"11634169-0b47-4aa5-90d3-6038111be8f6\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.888934 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4khg\" (UniqueName: \"kubernetes.io/projected/11634169-0b47-4aa5-90d3-6038111be8f6-kube-api-access-s4khg\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm\" (UID: \"11634169-0b47-4aa5-90d3-6038111be8f6\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" Jan 22 09:15:51 crc kubenswrapper[4681]: I0122 09:15:51.982622 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" Jan 22 09:15:52 crc kubenswrapper[4681]: I0122 09:15:52.222484 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm"] Jan 22 09:15:52 crc kubenswrapper[4681]: I0122 09:15:52.342959 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" event={"ID":"11634169-0b47-4aa5-90d3-6038111be8f6","Type":"ContainerStarted","Data":"49dce9abbadeec4e0fcae3bddaa708a67a254d240db0a0fdc3059ecba592c1ce"} Jan 22 09:15:52 crc kubenswrapper[4681]: I0122 09:15:52.343855 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" event={"ID":"910fbf37-770d-44c7-812d-805b7097b592","Type":"ContainerStarted","Data":"abf8052a23b194f026cb495052a98a52245bd61413cdb283f0c79f332e89c399"} Jan 22 09:15:52 crc kubenswrapper[4681]: I0122 09:15:52.664080 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd"] Jan 22 09:15:52 crc kubenswrapper[4681]: I0122 09:15:52.665551 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" Jan 22 09:15:52 crc kubenswrapper[4681]: I0122 09:15:52.681842 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd"] Jan 22 09:15:52 crc kubenswrapper[4681]: I0122 09:15:52.770282 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kc7x\" (UniqueName: \"kubernetes.io/projected/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-kube-api-access-4kc7x\") pod \"27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd\" (UID: \"689f04b5-a34d-4a7a-ad94-5ec5e77a8371\") " pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" Jan 22 09:15:52 crc kubenswrapper[4681]: I0122 09:15:52.770386 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-bundle\") pod \"27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd\" (UID: \"689f04b5-a34d-4a7a-ad94-5ec5e77a8371\") " pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" Jan 22 09:15:52 crc kubenswrapper[4681]: I0122 09:15:52.770436 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-util\") pod \"27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd\" (UID: \"689f04b5-a34d-4a7a-ad94-5ec5e77a8371\") " pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" Jan 22 09:15:52 crc kubenswrapper[4681]: I0122 09:15:52.871340 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-bundle\") pod \"27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd\" (UID: \"689f04b5-a34d-4a7a-ad94-5ec5e77a8371\") " 
pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" Jan 22 09:15:52 crc kubenswrapper[4681]: I0122 09:15:52.871392 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-util\") pod \"27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd\" (UID: \"689f04b5-a34d-4a7a-ad94-5ec5e77a8371\") " pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" Jan 22 09:15:52 crc kubenswrapper[4681]: I0122 09:15:52.871459 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kc7x\" (UniqueName: \"kubernetes.io/projected/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-kube-api-access-4kc7x\") pod \"27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd\" (UID: \"689f04b5-a34d-4a7a-ad94-5ec5e77a8371\") " pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" Jan 22 09:15:52 crc kubenswrapper[4681]: I0122 09:15:52.871802 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-bundle\") pod \"27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd\" (UID: \"689f04b5-a34d-4a7a-ad94-5ec5e77a8371\") " pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" Jan 22 09:15:52 crc kubenswrapper[4681]: I0122 09:15:52.871849 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-util\") pod \"27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd\" (UID: \"689f04b5-a34d-4a7a-ad94-5ec5e77a8371\") " pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" Jan 22 09:15:52 crc kubenswrapper[4681]: I0122 09:15:52.891829 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kc7x\" (UniqueName: \"kubernetes.io/projected/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-kube-api-access-4kc7x\") pod \"27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd\" (UID: \"689f04b5-a34d-4a7a-ad94-5ec5e77a8371\") " pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" Jan 22 09:15:53 crc kubenswrapper[4681]: I0122 09:15:53.016893 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" Jan 22 09:15:53 crc kubenswrapper[4681]: I0122 09:15:53.241187 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd"] Jan 22 09:15:53 crc kubenswrapper[4681]: W0122 09:15:53.246127 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod689f04b5_a34d_4a7a_ad94_5ec5e77a8371.slice/crio-89cde8ae6f9e1db50ad86bb443d9eead0321589c9d1a5cec039624339810c034 WatchSource:0}: Error finding container 89cde8ae6f9e1db50ad86bb443d9eead0321589c9d1a5cec039624339810c034: Status 404 returned error can't find the container with id 89cde8ae6f9e1db50ad86bb443d9eead0321589c9d1a5cec039624339810c034 Jan 22 09:15:53 crc kubenswrapper[4681]: I0122 09:15:53.351760 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" event={"ID":"689f04b5-a34d-4a7a-ad94-5ec5e77a8371","Type":"ContainerStarted","Data":"89cde8ae6f9e1db50ad86bb443d9eead0321589c9d1a5cec039624339810c034"} Jan 22 09:15:54 crc kubenswrapper[4681]: I0122 09:15:54.361160 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" event={"ID":"689f04b5-a34d-4a7a-ad94-5ec5e77a8371","Type":"ContainerStarted","Data":"734669e299eebe486124214a9af16064ce588c40179a6909a7d7a2736cf55df6"} Jan 22 09:15:54 crc kubenswrapper[4681]: I0122 09:15:54.365275 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" event={"ID":"11634169-0b47-4aa5-90d3-6038111be8f6","Type":"ContainerStarted","Data":"f12541cf5643247528a41047d96abf89ff6b0876c75e738b0074cfe117656c15"} Jan 22 09:15:54 crc kubenswrapper[4681]: I0122 09:15:54.373141 4681 generic.go:334] "Generic (PLEG): container finished" podID="910fbf37-770d-44c7-812d-805b7097b592" containerID="fcf9aca213e8ded36b611520b275f9368e33c418e335c56c98ec9b8bc5fcc3cb" exitCode=0 Jan 22 09:15:54 crc kubenswrapper[4681]: I0122 09:15:54.373196 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" event={"ID":"910fbf37-770d-44c7-812d-805b7097b592","Type":"ContainerDied","Data":"fcf9aca213e8ded36b611520b275f9368e33c418e335c56c98ec9b8bc5fcc3cb"} Jan 22 09:15:55 crc kubenswrapper[4681]: I0122 09:15:55.383454 4681 generic.go:334] "Generic (PLEG): container finished" podID="689f04b5-a34d-4a7a-ad94-5ec5e77a8371" containerID="734669e299eebe486124214a9af16064ce588c40179a6909a7d7a2736cf55df6" exitCode=0 Jan 22 09:15:55 crc kubenswrapper[4681]: I0122 09:15:55.383560 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" event={"ID":"689f04b5-a34d-4a7a-ad94-5ec5e77a8371","Type":"ContainerDied","Data":"734669e299eebe486124214a9af16064ce588c40179a6909a7d7a2736cf55df6"} Jan 22 09:15:55 crc kubenswrapper[4681]: I0122 09:15:55.396629 4681 generic.go:334] "Generic (PLEG): container finished" podID="11634169-0b47-4aa5-90d3-6038111be8f6" containerID="f12541cf5643247528a41047d96abf89ff6b0876c75e738b0074cfe117656c15" exitCode=0 Jan 22 09:15:55 crc kubenswrapper[4681]: I0122 09:15:55.396757 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" event={"ID":"11634169-0b47-4aa5-90d3-6038111be8f6","Type":"ContainerDied","Data":"f12541cf5643247528a41047d96abf89ff6b0876c75e738b0074cfe117656c15"} Jan 22 09:15:55 crc kubenswrapper[4681]: I0122 09:15:55.400604 4681 generic.go:334] "Generic (PLEG): container finished" podID="910fbf37-770d-44c7-812d-805b7097b592" containerID="1214c85ef36403abe8dda943ecf21c75b79b810bfa7d5201afbfb9067c54f86a" exitCode=0 Jan 22 09:15:55 crc kubenswrapper[4681]: I0122 09:15:55.400660 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" event={"ID":"910fbf37-770d-44c7-812d-805b7097b592","Type":"ContainerDied","Data":"1214c85ef36403abe8dda943ecf21c75b79b810bfa7d5201afbfb9067c54f86a"} Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.031144 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.031664 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.031743 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.032941 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ac7ebdaf79be25ddb77d35c84045ee94bb99bba54c6d511a9e4c0510347ef3c"} pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.033239 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" containerID="cri-o://6ac7ebdaf79be25ddb77d35c84045ee94bb99bba54c6d511a9e4c0510347ef3c" gracePeriod=600 Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.411288 4681 generic.go:334] "Generic (PLEG): container finished" podID="910fbf37-770d-44c7-812d-805b7097b592" containerID="dc4e1e6e28e4f3705197c505250ff53ba4d0f638a709c6d20283a1b44eff06cc" exitCode=0 Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.411338 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" event={"ID":"910fbf37-770d-44c7-812d-805b7097b592","Type":"ContainerDied","Data":"dc4e1e6e28e4f3705197c505250ff53ba4d0f638a709c6d20283a1b44eff06cc"} Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.415154 4681 generic.go:334] "Generic (PLEG): container finished" podID="689f04b5-a34d-4a7a-ad94-5ec5e77a8371" containerID="15b95e18f6e47200184746719b31d71b4abd57b866693fc3fbb3fb3f59219d27" exitCode=0 Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 
09:15:56.415227 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" event={"ID":"689f04b5-a34d-4a7a-ad94-5ec5e77a8371","Type":"ContainerDied","Data":"15b95e18f6e47200184746719b31d71b4abd57b866693fc3fbb3fb3f59219d27"} Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.421936 4681 generic.go:334] "Generic (PLEG): container finished" podID="11634169-0b47-4aa5-90d3-6038111be8f6" containerID="648737d25f09826e9f0c8faa0844f3010b9ff53ab1dc1b1115c1af61ee711eab" exitCode=0 Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.421997 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" event={"ID":"11634169-0b47-4aa5-90d3-6038111be8f6","Type":"ContainerDied","Data":"648737d25f09826e9f0c8faa0844f3010b9ff53ab1dc1b1115c1af61ee711eab"} Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.426610 4681 generic.go:334] "Generic (PLEG): container finished" podID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerID="6ac7ebdaf79be25ddb77d35c84045ee94bb99bba54c6d511a9e4c0510347ef3c" exitCode=0 Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.426651 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerDied","Data":"6ac7ebdaf79be25ddb77d35c84045ee94bb99bba54c6d511a9e4c0510347ef3c"} Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.426675 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerStarted","Data":"f2e24b305b35b3a09fbd338e564438cca6ba09e567d5c64f883d53c47948b3c4"} Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.426696 4681 scope.go:117] "RemoveContainer" containerID="39698b3d5d144b43917440c8cecc471264f2f7dcffc44d6bfe898d27e9d76dce" Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.606025 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hsfkk"] Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.607540 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hsfkk" Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.615707 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hsfkk"] Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.730910 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb6bq\" (UniqueName: \"kubernetes.io/projected/fb386249-9f93-4235-a26e-77274da23692-kube-api-access-jb6bq\") pod \"redhat-operators-hsfkk\" (UID: \"fb386249-9f93-4235-a26e-77274da23692\") " pod="openshift-marketplace/redhat-operators-hsfkk" Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.730989 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb386249-9f93-4235-a26e-77274da23692-catalog-content\") pod \"redhat-operators-hsfkk\" (UID: \"fb386249-9f93-4235-a26e-77274da23692\") " pod="openshift-marketplace/redhat-operators-hsfkk" Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.731022 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb386249-9f93-4235-a26e-77274da23692-utilities\") pod \"redhat-operators-hsfkk\" (UID: \"fb386249-9f93-4235-a26e-77274da23692\") " pod="openshift-marketplace/redhat-operators-hsfkk" Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.832132 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb386249-9f93-4235-a26e-77274da23692-utilities\") pod \"redhat-operators-hsfkk\" (UID: \"fb386249-9f93-4235-a26e-77274da23692\") " pod="openshift-marketplace/redhat-operators-hsfkk" Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.832772 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb6bq\" (UniqueName: \"kubernetes.io/projected/fb386249-9f93-4235-a26e-77274da23692-kube-api-access-jb6bq\") pod \"redhat-operators-hsfkk\" (UID: \"fb386249-9f93-4235-a26e-77274da23692\") " pod="openshift-marketplace/redhat-operators-hsfkk" Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.832849 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb386249-9f93-4235-a26e-77274da23692-catalog-content\") pod \"redhat-operators-hsfkk\" (UID: \"fb386249-9f93-4235-a26e-77274da23692\") " pod="openshift-marketplace/redhat-operators-hsfkk" Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.833643 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb386249-9f93-4235-a26e-77274da23692-catalog-content\") pod \"redhat-operators-hsfkk\" (UID: \"fb386249-9f93-4235-a26e-77274da23692\") " pod="openshift-marketplace/redhat-operators-hsfkk" Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.833876 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb386249-9f93-4235-a26e-77274da23692-utilities\") pod \"redhat-operators-hsfkk\" (UID: \"fb386249-9f93-4235-a26e-77274da23692\") " pod="openshift-marketplace/redhat-operators-hsfkk" Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.916057 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jb6bq\" (UniqueName: \"kubernetes.io/projected/fb386249-9f93-4235-a26e-77274da23692-kube-api-access-jb6bq\") pod \"redhat-operators-hsfkk\" (UID: \"fb386249-9f93-4235-a26e-77274da23692\") " pod="openshift-marketplace/redhat-operators-hsfkk" Jan 22 09:15:56 crc kubenswrapper[4681]: I0122 09:15:56.933926 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hsfkk" Jan 22 09:15:57 crc kubenswrapper[4681]: I0122 09:15:57.130230 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hsfkk"] Jan 22 09:15:57 crc kubenswrapper[4681]: I0122 09:15:57.436646 4681 generic.go:334] "Generic (PLEG): container finished" podID="689f04b5-a34d-4a7a-ad94-5ec5e77a8371" containerID="745664c81c115fb201ffe3394d68a7312db541d820a3a338b11126bd94fd43b6" exitCode=0 Jan 22 09:15:57 crc kubenswrapper[4681]: I0122 09:15:57.436754 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" event={"ID":"689f04b5-a34d-4a7a-ad94-5ec5e77a8371","Type":"ContainerDied","Data":"745664c81c115fb201ffe3394d68a7312db541d820a3a338b11126bd94fd43b6"} Jan 22 09:15:57 crc kubenswrapper[4681]: I0122 09:15:57.439965 4681 generic.go:334] "Generic (PLEG): container finished" podID="11634169-0b47-4aa5-90d3-6038111be8f6" containerID="e068a3accaab21040abc9dd73780d930aa07265cfd3be03eba24276c2c6a161e" exitCode=0 Jan 22 09:15:57 crc kubenswrapper[4681]: I0122 09:15:57.440092 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" event={"ID":"11634169-0b47-4aa5-90d3-6038111be8f6","Type":"ContainerDied","Data":"e068a3accaab21040abc9dd73780d930aa07265cfd3be03eba24276c2c6a161e"} Jan 22 09:15:57 crc kubenswrapper[4681]: I0122 09:15:57.444610 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsfkk" event={"ID":"fb386249-9f93-4235-a26e-77274da23692","Type":"ContainerStarted","Data":"d3f092ed68d7eda334215098df4f4f41d3229a47badd8ba5308cccf6b7c71b77"} Jan 22 09:15:57 crc kubenswrapper[4681]: I0122 09:15:57.671993 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" Jan 22 09:15:57 crc kubenswrapper[4681]: I0122 09:15:57.744980 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7gqj\" (UniqueName: \"kubernetes.io/projected/910fbf37-770d-44c7-812d-805b7097b592-kube-api-access-m7gqj\") pod \"910fbf37-770d-44c7-812d-805b7097b592\" (UID: \"910fbf37-770d-44c7-812d-805b7097b592\") " Jan 22 09:15:57 crc kubenswrapper[4681]: I0122 09:15:57.745166 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/910fbf37-770d-44c7-812d-805b7097b592-bundle\") pod \"910fbf37-770d-44c7-812d-805b7097b592\" (UID: \"910fbf37-770d-44c7-812d-805b7097b592\") " Jan 22 09:15:57 crc kubenswrapper[4681]: I0122 09:15:57.745235 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/910fbf37-770d-44c7-812d-805b7097b592-util\") pod \"910fbf37-770d-44c7-812d-805b7097b592\" (UID: \"910fbf37-770d-44c7-812d-805b7097b592\") " Jan 22 09:15:57 crc kubenswrapper[4681]: I0122 09:15:57.746863 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/910fbf37-770d-44c7-812d-805b7097b592-bundle" (OuterVolumeSpecName: "bundle") pod "910fbf37-770d-44c7-812d-805b7097b592" (UID: "910fbf37-770d-44c7-812d-805b7097b592"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:15:57 crc kubenswrapper[4681]: I0122 09:15:57.750667 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/910fbf37-770d-44c7-812d-805b7097b592-kube-api-access-m7gqj" (OuterVolumeSpecName: "kube-api-access-m7gqj") pod "910fbf37-770d-44c7-812d-805b7097b592" (UID: "910fbf37-770d-44c7-812d-805b7097b592"). InnerVolumeSpecName "kube-api-access-m7gqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:15:57 crc kubenswrapper[4681]: I0122 09:15:57.765145 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/910fbf37-770d-44c7-812d-805b7097b592-util" (OuterVolumeSpecName: "util") pod "910fbf37-770d-44c7-812d-805b7097b592" (UID: "910fbf37-770d-44c7-812d-805b7097b592"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:15:57 crc kubenswrapper[4681]: I0122 09:15:57.846544 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7gqj\" (UniqueName: \"kubernetes.io/projected/910fbf37-770d-44c7-812d-805b7097b592-kube-api-access-m7gqj\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:57 crc kubenswrapper[4681]: I0122 09:15:57.846590 4681 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/910fbf37-770d-44c7-812d-805b7097b592-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:57 crc kubenswrapper[4681]: I0122 09:15:57.846603 4681 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/910fbf37-770d-44c7-812d-805b7097b592-util\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.453728 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" event={"ID":"910fbf37-770d-44c7-812d-805b7097b592","Type":"ContainerDied","Data":"abf8052a23b194f026cb495052a98a52245bd61413cdb283f0c79f332e89c399"} Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.454038 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abf8052a23b194f026cb495052a98a52245bd61413cdb283f0c79f332e89c399" Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.453881 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j" Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.709547 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.772388 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11634169-0b47-4aa5-90d3-6038111be8f6-bundle\") pod \"11634169-0b47-4aa5-90d3-6038111be8f6\" (UID: \"11634169-0b47-4aa5-90d3-6038111be8f6\") " Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.772619 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11634169-0b47-4aa5-90d3-6038111be8f6-util\") pod \"11634169-0b47-4aa5-90d3-6038111be8f6\" (UID: \"11634169-0b47-4aa5-90d3-6038111be8f6\") " Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.772694 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4khg\" (UniqueName: \"kubernetes.io/projected/11634169-0b47-4aa5-90d3-6038111be8f6-kube-api-access-s4khg\") pod \"11634169-0b47-4aa5-90d3-6038111be8f6\" (UID: \"11634169-0b47-4aa5-90d3-6038111be8f6\") " Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.773208 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11634169-0b47-4aa5-90d3-6038111be8f6-bundle" (OuterVolumeSpecName: "bundle") pod "11634169-0b47-4aa5-90d3-6038111be8f6" (UID: "11634169-0b47-4aa5-90d3-6038111be8f6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.780396 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11634169-0b47-4aa5-90d3-6038111be8f6-kube-api-access-s4khg" (OuterVolumeSpecName: "kube-api-access-s4khg") pod "11634169-0b47-4aa5-90d3-6038111be8f6" (UID: "11634169-0b47-4aa5-90d3-6038111be8f6"). InnerVolumeSpecName "kube-api-access-s4khg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.788234 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.793752 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11634169-0b47-4aa5-90d3-6038111be8f6-util" (OuterVolumeSpecName: "util") pod "11634169-0b47-4aa5-90d3-6038111be8f6" (UID: "11634169-0b47-4aa5-90d3-6038111be8f6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.873548 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-util\") pod \"689f04b5-a34d-4a7a-ad94-5ec5e77a8371\" (UID: \"689f04b5-a34d-4a7a-ad94-5ec5e77a8371\") " Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.873653 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-bundle\") pod \"689f04b5-a34d-4a7a-ad94-5ec5e77a8371\" (UID: \"689f04b5-a34d-4a7a-ad94-5ec5e77a8371\") " Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.873720 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kc7x\" (UniqueName: \"kubernetes.io/projected/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-kube-api-access-4kc7x\") pod \"689f04b5-a34d-4a7a-ad94-5ec5e77a8371\" (UID: \"689f04b5-a34d-4a7a-ad94-5ec5e77a8371\") " Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.874240 4681 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11634169-0b47-4aa5-90d3-6038111be8f6-util\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.874311 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4khg\" (UniqueName: \"kubernetes.io/projected/11634169-0b47-4aa5-90d3-6038111be8f6-kube-api-access-s4khg\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.874334 4681 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11634169-0b47-4aa5-90d3-6038111be8f6-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.876434 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-bundle" (OuterVolumeSpecName: "bundle") pod "689f04b5-a34d-4a7a-ad94-5ec5e77a8371" (UID: "689f04b5-a34d-4a7a-ad94-5ec5e77a8371"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.878883 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-kube-api-access-4kc7x" (OuterVolumeSpecName: "kube-api-access-4kc7x") pod "689f04b5-a34d-4a7a-ad94-5ec5e77a8371" (UID: "689f04b5-a34d-4a7a-ad94-5ec5e77a8371"). InnerVolumeSpecName "kube-api-access-4kc7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.895134 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-util" (OuterVolumeSpecName: "util") pod "689f04b5-a34d-4a7a-ad94-5ec5e77a8371" (UID: "689f04b5-a34d-4a7a-ad94-5ec5e77a8371"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.975693 4681 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-util\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.975731 4681 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-bundle\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:58 crc kubenswrapper[4681]: I0122 09:15:58.975744 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kc7x\" (UniqueName: \"kubernetes.io/projected/689f04b5-a34d-4a7a-ad94-5ec5e77a8371-kube-api-access-4kc7x\") on node \"crc\" DevicePath \"\"" Jan 22 09:15:59 crc kubenswrapper[4681]: I0122 09:15:59.470440 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" event={"ID":"689f04b5-a34d-4a7a-ad94-5ec5e77a8371","Type":"ContainerDied","Data":"89cde8ae6f9e1db50ad86bb443d9eead0321589c9d1a5cec039624339810c034"} Jan 22 09:15:59 crc kubenswrapper[4681]: I0122 09:15:59.470813 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89cde8ae6f9e1db50ad86bb443d9eead0321589c9d1a5cec039624339810c034" Jan 22 09:15:59 crc kubenswrapper[4681]: I0122 09:15:59.470892 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd" Jan 22 09:15:59 crc kubenswrapper[4681]: I0122 09:15:59.480842 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" event={"ID":"11634169-0b47-4aa5-90d3-6038111be8f6","Type":"ContainerDied","Data":"49dce9abbadeec4e0fcae3bddaa708a67a254d240db0a0fdc3059ecba592c1ce"} Jan 22 09:15:59 crc kubenswrapper[4681]: I0122 09:15:59.480878 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49dce9abbadeec4e0fcae3bddaa708a67a254d240db0a0fdc3059ecba592c1ce" Jan 22 09:15:59 crc kubenswrapper[4681]: I0122 09:15:59.480958 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm" Jan 22 09:15:59 crc kubenswrapper[4681]: I0122 09:15:59.485111 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsfkk" event={"ID":"fb386249-9f93-4235-a26e-77274da23692","Type":"ContainerStarted","Data":"9fe9603fe143ad9a219aa8ee8fb1f0badefdb5985b28822da47a4f7b06e49202"} Jan 22 09:16:00 crc kubenswrapper[4681]: I0122 09:16:00.496953 4681 generic.go:334] "Generic (PLEG): container finished" podID="fb386249-9f93-4235-a26e-77274da23692" containerID="9fe9603fe143ad9a219aa8ee8fb1f0badefdb5985b28822da47a4f7b06e49202" exitCode=0 Jan 22 09:16:00 crc kubenswrapper[4681]: I0122 09:16:00.497058 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsfkk" event={"ID":"fb386249-9f93-4235-a26e-77274da23692","Type":"ContainerDied","Data":"9fe9603fe143ad9a219aa8ee8fb1f0badefdb5985b28822da47a4f7b06e49202"} Jan 22 09:16:02 crc kubenswrapper[4681]: I0122 09:16:02.513200 4681 generic.go:334] "Generic (PLEG): container finished" podID="fb386249-9f93-4235-a26e-77274da23692" containerID="f0ece9aba3c46c3cb12d6ef02ab1e12b56353b932d627a49a10a9eb7c9760862" exitCode=0 Jan 22 09:16:02 crc kubenswrapper[4681]: I0122 09:16:02.513449 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsfkk" event={"ID":"fb386249-9f93-4235-a26e-77274da23692","Type":"ContainerDied","Data":"f0ece9aba3c46c3cb12d6ef02ab1e12b56353b932d627a49a10a9eb7c9760862"} Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.075227 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-55b89ddfb9-748rx"] Jan 22 09:16:03 crc kubenswrapper[4681]: E0122 09:16:03.075672 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11634169-0b47-4aa5-90d3-6038111be8f6" containerName="util" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.075684 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="11634169-0b47-4aa5-90d3-6038111be8f6" containerName="util" Jan 22 09:16:03 crc kubenswrapper[4681]: E0122 09:16:03.075694 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11634169-0b47-4aa5-90d3-6038111be8f6" containerName="extract" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.075701 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="11634169-0b47-4aa5-90d3-6038111be8f6" containerName="extract" Jan 22 09:16:03 crc kubenswrapper[4681]: E0122 09:16:03.075709 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="910fbf37-770d-44c7-812d-805b7097b592" containerName="extract" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.075715 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="910fbf37-770d-44c7-812d-805b7097b592" containerName="extract" Jan 22 09:16:03 crc kubenswrapper[4681]: E0122 09:16:03.075724 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="689f04b5-a34d-4a7a-ad94-5ec5e77a8371" containerName="pull" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.075729 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="689f04b5-a34d-4a7a-ad94-5ec5e77a8371" containerName="pull" Jan 22 09:16:03 crc kubenswrapper[4681]: E0122 09:16:03.075737 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="689f04b5-a34d-4a7a-ad94-5ec5e77a8371" containerName="util" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.075745 4681 
state_mem.go:107] "Deleted CPUSet assignment" podUID="689f04b5-a34d-4a7a-ad94-5ec5e77a8371" containerName="util" Jan 22 09:16:03 crc kubenswrapper[4681]: E0122 09:16:03.075758 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="689f04b5-a34d-4a7a-ad94-5ec5e77a8371" containerName="extract" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.075764 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="689f04b5-a34d-4a7a-ad94-5ec5e77a8371" containerName="extract" Jan 22 09:16:03 crc kubenswrapper[4681]: E0122 09:16:03.075772 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11634169-0b47-4aa5-90d3-6038111be8f6" containerName="pull" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.075778 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="11634169-0b47-4aa5-90d3-6038111be8f6" containerName="pull" Jan 22 09:16:03 crc kubenswrapper[4681]: E0122 09:16:03.075787 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="910fbf37-770d-44c7-812d-805b7097b592" containerName="pull" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.075793 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="910fbf37-770d-44c7-812d-805b7097b592" containerName="pull" Jan 22 09:16:03 crc kubenswrapper[4681]: E0122 09:16:03.075803 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="910fbf37-770d-44c7-812d-805b7097b592" containerName="util" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.075808 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="910fbf37-770d-44c7-812d-805b7097b592" containerName="util" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.075894 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="910fbf37-770d-44c7-812d-805b7097b592" containerName="extract" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.075904 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="689f04b5-a34d-4a7a-ad94-5ec5e77a8371" containerName="extract" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.075913 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="11634169-0b47-4aa5-90d3-6038111be8f6" containerName="extract" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.076296 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-55b89ddfb9-748rx" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.082298 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-95f85" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.085475 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-55b89ddfb9-748rx"] Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.132973 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/3f0e9240-f934-448c-8027-b5d44f6ca38c-runner\") pod \"service-telemetry-operator-55b89ddfb9-748rx\" (UID: \"3f0e9240-f934-448c-8027-b5d44f6ca38c\") " pod="service-telemetry/service-telemetry-operator-55b89ddfb9-748rx" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.133027 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t6pz\" (UniqueName: \"kubernetes.io/projected/3f0e9240-f934-448c-8027-b5d44f6ca38c-kube-api-access-8t6pz\") pod \"service-telemetry-operator-55b89ddfb9-748rx\" (UID: \"3f0e9240-f934-448c-8027-b5d44f6ca38c\") " pod="service-telemetry/service-telemetry-operator-55b89ddfb9-748rx" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.234355 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/3f0e9240-f934-448c-8027-b5d44f6ca38c-runner\") pod \"service-telemetry-operator-55b89ddfb9-748rx\" (UID: \"3f0e9240-f934-448c-8027-b5d44f6ca38c\") " pod="service-telemetry/service-telemetry-operator-55b89ddfb9-748rx" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.234410 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t6pz\" (UniqueName: \"kubernetes.io/projected/3f0e9240-f934-448c-8027-b5d44f6ca38c-kube-api-access-8t6pz\") pod \"service-telemetry-operator-55b89ddfb9-748rx\" (UID: \"3f0e9240-f934-448c-8027-b5d44f6ca38c\") " pod="service-telemetry/service-telemetry-operator-55b89ddfb9-748rx" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.235237 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/3f0e9240-f934-448c-8027-b5d44f6ca38c-runner\") pod \"service-telemetry-operator-55b89ddfb9-748rx\" (UID: \"3f0e9240-f934-448c-8027-b5d44f6ca38c\") " pod="service-telemetry/service-telemetry-operator-55b89ddfb9-748rx" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.267526 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t6pz\" (UniqueName: \"kubernetes.io/projected/3f0e9240-f934-448c-8027-b5d44f6ca38c-kube-api-access-8t6pz\") pod \"service-telemetry-operator-55b89ddfb9-748rx\" (UID: \"3f0e9240-f934-448c-8027-b5d44f6ca38c\") " pod="service-telemetry/service-telemetry-operator-55b89ddfb9-748rx" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.388764 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-55b89ddfb9-748rx" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.554598 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsfkk" event={"ID":"fb386249-9f93-4235-a26e-77274da23692","Type":"ContainerStarted","Data":"9c26183d45cb6fc956058f0314caa667d8e7c5823d0b23260310d7cc42a10ed7"} Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.578245 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hsfkk" podStartSLOduration=5.154455255 podStartE2EDuration="7.578226291s" podCreationTimestamp="2026-01-22 09:15:56 +0000 UTC" firstStartedPulling="2026-01-22 09:16:00.500295057 +0000 UTC m=+751.326205582" lastFinishedPulling="2026-01-22 09:16:02.924066103 +0000 UTC m=+753.749976618" observedRunningTime="2026-01-22 09:16:03.570836325 +0000 UTC m=+754.396746840" watchObservedRunningTime="2026-01-22 09:16:03.578226291 +0000 UTC m=+754.404136796" Jan 22 09:16:03 crc kubenswrapper[4681]: I0122 09:16:03.679426 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-55b89ddfb9-748rx"] Jan 22 09:16:04 crc kubenswrapper[4681]: I0122 09:16:04.566658 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-55b89ddfb9-748rx" event={"ID":"3f0e9240-f934-448c-8027-b5d44f6ca38c","Type":"ContainerStarted","Data":"87edbfe551c3ee07bc00c79fa41fc766f0f0b857a01b24c6d6b324ac737056a6"} Jan 22 09:16:04 crc kubenswrapper[4681]: I0122 09:16:04.969034 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bbbc889bc-859fl"] Jan 22 09:16:04 crc kubenswrapper[4681]: I0122 09:16:04.969714 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bbbc889bc-859fl" Jan 22 09:16:04 crc kubenswrapper[4681]: I0122 09:16:04.972243 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-782kx" Jan 22 09:16:04 crc kubenswrapper[4681]: I0122 09:16:04.990942 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bbbc889bc-859fl"] Jan 22 09:16:05 crc kubenswrapper[4681]: I0122 09:16:05.070235 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/91a6ad3b-58ad-4095-97df-be878b439ac6-runner\") pod \"smart-gateway-operator-bbbc889bc-859fl\" (UID: \"91a6ad3b-58ad-4095-97df-be878b439ac6\") " pod="service-telemetry/smart-gateway-operator-bbbc889bc-859fl" Jan 22 09:16:05 crc kubenswrapper[4681]: I0122 09:16:05.070351 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w97rp\" (UniqueName: \"kubernetes.io/projected/91a6ad3b-58ad-4095-97df-be878b439ac6-kube-api-access-w97rp\") pod \"smart-gateway-operator-bbbc889bc-859fl\" (UID: \"91a6ad3b-58ad-4095-97df-be878b439ac6\") " pod="service-telemetry/smart-gateway-operator-bbbc889bc-859fl" Jan 22 09:16:05 crc kubenswrapper[4681]: I0122 09:16:05.171256 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w97rp\" (UniqueName: \"kubernetes.io/projected/91a6ad3b-58ad-4095-97df-be878b439ac6-kube-api-access-w97rp\") pod \"smart-gateway-operator-bbbc889bc-859fl\" (UID: \"91a6ad3b-58ad-4095-97df-be878b439ac6\") " pod="service-telemetry/smart-gateway-operator-bbbc889bc-859fl" Jan 22 09:16:05 crc kubenswrapper[4681]: I0122 09:16:05.171337 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/91a6ad3b-58ad-4095-97df-be878b439ac6-runner\") pod \"smart-gateway-operator-bbbc889bc-859fl\" (UID: \"91a6ad3b-58ad-4095-97df-be878b439ac6\") " pod="service-telemetry/smart-gateway-operator-bbbc889bc-859fl" Jan 22 09:16:05 crc kubenswrapper[4681]: I0122 09:16:05.171875 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/91a6ad3b-58ad-4095-97df-be878b439ac6-runner\") pod \"smart-gateway-operator-bbbc889bc-859fl\" (UID: \"91a6ad3b-58ad-4095-97df-be878b439ac6\") " pod="service-telemetry/smart-gateway-operator-bbbc889bc-859fl" Jan 22 09:16:05 crc kubenswrapper[4681]: I0122 09:16:05.198435 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w97rp\" (UniqueName: \"kubernetes.io/projected/91a6ad3b-58ad-4095-97df-be878b439ac6-kube-api-access-w97rp\") pod \"smart-gateway-operator-bbbc889bc-859fl\" (UID: \"91a6ad3b-58ad-4095-97df-be878b439ac6\") " pod="service-telemetry/smart-gateway-operator-bbbc889bc-859fl" Jan 22 09:16:05 crc kubenswrapper[4681]: I0122 09:16:05.299588 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bbbc889bc-859fl" Jan 22 09:16:05 crc kubenswrapper[4681]: I0122 09:16:05.531095 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bbbc889bc-859fl"] Jan 22 09:16:05 crc kubenswrapper[4681]: W0122 09:16:05.540077 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91a6ad3b_58ad_4095_97df_be878b439ac6.slice/crio-943d5160620efaf84bf7c0d212ab15db49b5a9e2136508036f7be1a603fcfc9f WatchSource:0}: Error finding container 943d5160620efaf84bf7c0d212ab15db49b5a9e2136508036f7be1a603fcfc9f: Status 404 returned error can't find the container with id 943d5160620efaf84bf7c0d212ab15db49b5a9e2136508036f7be1a603fcfc9f Jan 22 09:16:05 crc kubenswrapper[4681]: I0122 09:16:05.582576 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bbbc889bc-859fl" event={"ID":"91a6ad3b-58ad-4095-97df-be878b439ac6","Type":"ContainerStarted","Data":"943d5160620efaf84bf7c0d212ab15db49b5a9e2136508036f7be1a603fcfc9f"} Jan 22 09:16:06 crc kubenswrapper[4681]: I0122 09:16:06.310235 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-z6l9z"] Jan 22 09:16:06 crc kubenswrapper[4681]: I0122 09:16:06.311430 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-z6l9z" Jan 22 09:16:06 crc kubenswrapper[4681]: I0122 09:16:06.313203 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-k8kfw" Jan 22 09:16:06 crc kubenswrapper[4681]: I0122 09:16:06.320559 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-z6l9z"] Jan 22 09:16:06 crc kubenswrapper[4681]: I0122 09:16:06.385301 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p6z7\" (UniqueName: \"kubernetes.io/projected/6d9c13a0-8fe9-486f-a6f5-0bdec2ab2686-kube-api-access-8p6z7\") pod \"interconnect-operator-5bb49f789d-z6l9z\" (UID: \"6d9c13a0-8fe9-486f-a6f5-0bdec2ab2686\") " pod="service-telemetry/interconnect-operator-5bb49f789d-z6l9z" Jan 22 09:16:06 crc kubenswrapper[4681]: I0122 09:16:06.486413 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p6z7\" (UniqueName: \"kubernetes.io/projected/6d9c13a0-8fe9-486f-a6f5-0bdec2ab2686-kube-api-access-8p6z7\") pod \"interconnect-operator-5bb49f789d-z6l9z\" (UID: \"6d9c13a0-8fe9-486f-a6f5-0bdec2ab2686\") " pod="service-telemetry/interconnect-operator-5bb49f789d-z6l9z" Jan 22 09:16:06 crc kubenswrapper[4681]: I0122 09:16:06.516803 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p6z7\" (UniqueName: \"kubernetes.io/projected/6d9c13a0-8fe9-486f-a6f5-0bdec2ab2686-kube-api-access-8p6z7\") pod \"interconnect-operator-5bb49f789d-z6l9z\" (UID: \"6d9c13a0-8fe9-486f-a6f5-0bdec2ab2686\") " pod="service-telemetry/interconnect-operator-5bb49f789d-z6l9z" Jan 22 09:16:06 crc kubenswrapper[4681]: I0122 09:16:06.663229 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-z6l9z" Jan 22 09:16:06 crc kubenswrapper[4681]: I0122 09:16:06.935105 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hsfkk" Jan 22 09:16:06 crc kubenswrapper[4681]: I0122 09:16:06.935479 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hsfkk" Jan 22 09:16:07 crc kubenswrapper[4681]: I0122 09:16:07.067724 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-z6l9z"] Jan 22 09:16:07 crc kubenswrapper[4681]: W0122 09:16:07.080155 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d9c13a0_8fe9_486f_a6f5_0bdec2ab2686.slice/crio-3c22d6193026cfa07dde4346f3aafa6415c8bde3121a6d2326c1af859a9cc7b0 WatchSource:0}: Error finding container 3c22d6193026cfa07dde4346f3aafa6415c8bde3121a6d2326c1af859a9cc7b0: Status 404 returned error can't find the container with id 3c22d6193026cfa07dde4346f3aafa6415c8bde3121a6d2326c1af859a9cc7b0 Jan 22 09:16:07 crc kubenswrapper[4681]: I0122 09:16:07.604304 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-z6l9z" event={"ID":"6d9c13a0-8fe9-486f-a6f5-0bdec2ab2686","Type":"ContainerStarted","Data":"3c22d6193026cfa07dde4346f3aafa6415c8bde3121a6d2326c1af859a9cc7b0"} Jan 22 09:16:07 crc kubenswrapper[4681]: I0122 09:16:07.978172 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hsfkk" podUID="fb386249-9f93-4235-a26e-77274da23692" containerName="registry-server" probeResult="failure" output=< Jan 22 09:16:07 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Jan 22 09:16:07 crc kubenswrapper[4681]: > Jan 22 09:16:16 crc kubenswrapper[4681]: I0122 09:16:16.990841 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hsfkk" Jan 22 09:16:17 crc kubenswrapper[4681]: I0122 09:16:17.039433 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hsfkk" Jan 22 09:16:20 crc kubenswrapper[4681]: I0122 09:16:20.400950 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hsfkk"] Jan 22 09:16:20 crc kubenswrapper[4681]: I0122 09:16:20.401343 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hsfkk" podUID="fb386249-9f93-4235-a26e-77274da23692" containerName="registry-server" containerID="cri-o://9c26183d45cb6fc956058f0314caa667d8e7c5823d0b23260310d7cc42a10ed7" gracePeriod=2 Jan 22 09:16:21 crc kubenswrapper[4681]: I0122 09:16:21.720045 4681 generic.go:334] "Generic (PLEG): container finished" podID="fb386249-9f93-4235-a26e-77274da23692" containerID="9c26183d45cb6fc956058f0314caa667d8e7c5823d0b23260310d7cc42a10ed7" exitCode=0 Jan 22 09:16:21 crc kubenswrapper[4681]: I0122 09:16:21.720118 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsfkk" event={"ID":"fb386249-9f93-4235-a26e-77274da23692","Type":"ContainerDied","Data":"9c26183d45cb6fc956058f0314caa667d8e7c5823d0b23260310d7cc42a10ed7"} Jan 22 09:16:24 crc kubenswrapper[4681]: I0122 09:16:24.402392 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hsfkk" Jan 22 09:16:24 crc kubenswrapper[4681]: I0122 09:16:24.441179 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb386249-9f93-4235-a26e-77274da23692-catalog-content\") pod \"fb386249-9f93-4235-a26e-77274da23692\" (UID: \"fb386249-9f93-4235-a26e-77274da23692\") " Jan 22 09:16:24 crc kubenswrapper[4681]: I0122 09:16:24.441231 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb6bq\" (UniqueName: \"kubernetes.io/projected/fb386249-9f93-4235-a26e-77274da23692-kube-api-access-jb6bq\") pod \"fb386249-9f93-4235-a26e-77274da23692\" (UID: \"fb386249-9f93-4235-a26e-77274da23692\") " Jan 22 09:16:24 crc kubenswrapper[4681]: I0122 09:16:24.441254 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb386249-9f93-4235-a26e-77274da23692-utilities\") pod \"fb386249-9f93-4235-a26e-77274da23692\" (UID: \"fb386249-9f93-4235-a26e-77274da23692\") " Jan 22 09:16:24 crc kubenswrapper[4681]: I0122 09:16:24.442351 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb386249-9f93-4235-a26e-77274da23692-utilities" (OuterVolumeSpecName: "utilities") pod "fb386249-9f93-4235-a26e-77274da23692" (UID: "fb386249-9f93-4235-a26e-77274da23692"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:16:24 crc kubenswrapper[4681]: I0122 09:16:24.448309 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb386249-9f93-4235-a26e-77274da23692-kube-api-access-jb6bq" (OuterVolumeSpecName: "kube-api-access-jb6bq") pod "fb386249-9f93-4235-a26e-77274da23692" (UID: "fb386249-9f93-4235-a26e-77274da23692"). InnerVolumeSpecName "kube-api-access-jb6bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:16:24 crc kubenswrapper[4681]: I0122 09:16:24.542950 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb6bq\" (UniqueName: \"kubernetes.io/projected/fb386249-9f93-4235-a26e-77274da23692-kube-api-access-jb6bq\") on node \"crc\" DevicePath \"\"" Jan 22 09:16:24 crc kubenswrapper[4681]: I0122 09:16:24.542985 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb386249-9f93-4235-a26e-77274da23692-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:16:24 crc kubenswrapper[4681]: I0122 09:16:24.557851 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb386249-9f93-4235-a26e-77274da23692-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb386249-9f93-4235-a26e-77274da23692" (UID: "fb386249-9f93-4235-a26e-77274da23692"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:16:24 crc kubenswrapper[4681]: I0122 09:16:24.643534 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb386249-9f93-4235-a26e-77274da23692-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:16:24 crc kubenswrapper[4681]: I0122 09:16:24.744789 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsfkk" event={"ID":"fb386249-9f93-4235-a26e-77274da23692","Type":"ContainerDied","Data":"d3f092ed68d7eda334215098df4f4f41d3229a47badd8ba5308cccf6b7c71b77"} Jan 22 09:16:24 crc kubenswrapper[4681]: I0122 09:16:24.744839 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hsfkk" Jan 22 09:16:24 crc kubenswrapper[4681]: I0122 09:16:24.744845 4681 scope.go:117] "RemoveContainer" containerID="9c26183d45cb6fc956058f0314caa667d8e7c5823d0b23260310d7cc42a10ed7" Jan 22 09:16:24 crc kubenswrapper[4681]: I0122 09:16:24.775400 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hsfkk"] Jan 22 09:16:24 crc kubenswrapper[4681]: I0122 09:16:24.781089 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hsfkk"] Jan 22 09:16:25 crc kubenswrapper[4681]: I0122 09:16:25.460447 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb386249-9f93-4235-a26e-77274da23692" path="/var/lib/kubelet/pods/fb386249-9f93-4235-a26e-77274da23692/volumes" Jan 22 09:16:31 crc kubenswrapper[4681]: I0122 09:16:31.096141 4681 scope.go:117] "RemoveContainer" containerID="f0ece9aba3c46c3cb12d6ef02ab1e12b56353b932d627a49a10a9eb7c9760862" Jan 22 09:16:31 crc kubenswrapper[4681]: I0122 09:16:31.650380 4681 scope.go:117] "RemoveContainer" containerID="9fe9603fe143ad9a219aa8ee8fb1f0badefdb5985b28822da47a4f7b06e49202" Jan 22 09:16:31 crc kubenswrapper[4681]: E0122 09:16:31.947158 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:latest" Jan 22 09:16:31 crc kubenswrapper[4681]: E0122 09:16:31.947696 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:quay.io/infrawatch/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:quay.io/infrawatch/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1768085178,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w97rp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-bbbc889bc-859fl_service-telemetry(91a6ad3b-58ad-4095-97df-be878b439ac6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 09:16:31 crc kubenswrapper[4681]: E0122 09:16:31.948979 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-bbbc889bc-859fl" podUID="91a6ad3b-58ad-4095-97df-be878b439ac6" Jan 22 09:16:32 crc kubenswrapper[4681]: E0122 09:16:32.031195 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/service-telemetry-operator:latest" Jan 22 09:16:32 crc kubenswrapper[4681]: E0122 09:16:32.031421 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/infrawatch/service-telemetry-operator:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:service-telemetry-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_WEBHOOK_SNMP_IMAGE,Value:quay.io/infrawatch/prometheus-webhook-snmp:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_IMAGE,Value:quay.io/prometheus/prometheus:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER_IMAGE,Value:quay.io/prometheus/alertmanager:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:service-telemetry-operator.v1.5.1768085182,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8t6pz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod service-telemetry-operator-55b89ddfb9-748rx_service-telemetry(3f0e9240-f934-448c-8027-b5d44f6ca38c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 09:16:32 crc kubenswrapper[4681]: E0122 09:16:32.032629 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/service-telemetry-operator-55b89ddfb9-748rx" podUID="3f0e9240-f934-448c-8027-b5d44f6ca38c" Jan 22 09:16:32 crc kubenswrapper[4681]: I0122 09:16:32.804070 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-z6l9z" event={"ID":"6d9c13a0-8fe9-486f-a6f5-0bdec2ab2686","Type":"ContainerStarted","Data":"caa9a8fdeaa23f01658185aaa8718bde98b38bd88d71ed7146ab30dab7c9a3e7"} Jan 22 09:16:32 crc kubenswrapper[4681]: E0122 09:16:32.806127 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:latest\\\"\"" 
pod="service-telemetry/smart-gateway-operator-bbbc889bc-859fl" podUID="91a6ad3b-58ad-4095-97df-be878b439ac6" Jan 22 09:16:32 crc kubenswrapper[4681]: E0122 09:16:32.806134 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/service-telemetry-operator:latest\\\"\"" pod="service-telemetry/service-telemetry-operator-55b89ddfb9-748rx" podUID="3f0e9240-f934-448c-8027-b5d44f6ca38c" Jan 22 09:16:32 crc kubenswrapper[4681]: I0122 09:16:32.848005 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-z6l9z" podStartSLOduration=8.768314119 podStartE2EDuration="26.847980761s" podCreationTimestamp="2026-01-22 09:16:06 +0000 UTC" firstStartedPulling="2026-01-22 09:16:07.08705186 +0000 UTC m=+757.912962365" lastFinishedPulling="2026-01-22 09:16:25.166718502 +0000 UTC m=+775.992629007" observedRunningTime="2026-01-22 09:16:32.842605599 +0000 UTC m=+783.668516134" watchObservedRunningTime="2026-01-22 09:16:32.847980761 +0000 UTC m=+783.673891306" Jan 22 09:16:45 crc kubenswrapper[4681]: I0122 09:16:45.905317 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bbbc889bc-859fl" event={"ID":"91a6ad3b-58ad-4095-97df-be878b439ac6","Type":"ContainerStarted","Data":"1ad3df88611315f52802e3126a725593aa729797a3301e912d3c7c5a971194d7"} Jan 22 09:16:45 crc kubenswrapper[4681]: I0122 09:16:45.925318 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-bbbc889bc-859fl" podStartSLOduration=2.073822515 podStartE2EDuration="41.925301343s" podCreationTimestamp="2026-01-22 09:16:04 +0000 UTC" firstStartedPulling="2026-01-22 09:16:05.548511813 +0000 UTC m=+756.374422318" lastFinishedPulling="2026-01-22 09:16:45.399990641 +0000 UTC m=+796.225901146" observedRunningTime="2026-01-22 09:16:45.924321667 +0000 UTC m=+796.750232172" watchObservedRunningTime="2026-01-22 09:16:45.925301343 +0000 UTC m=+796.751211848" Jan 22 09:16:49 crc kubenswrapper[4681]: I0122 09:16:49.935191 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-55b89ddfb9-748rx" event={"ID":"3f0e9240-f934-448c-8027-b5d44f6ca38c","Type":"ContainerStarted","Data":"951493c3dce8fcf10ee6423f85eb5894dd0f030f9af7675c882fee439f69b436"} Jan 22 09:16:49 crc kubenswrapper[4681]: I0122 09:16:49.957517 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-55b89ddfb9-748rx" podStartSLOduration=1.258472153 podStartE2EDuration="46.95749171s" podCreationTimestamp="2026-01-22 09:16:03 +0000 UTC" firstStartedPulling="2026-01-22 09:16:03.690919222 +0000 UTC m=+754.516829727" lastFinishedPulling="2026-01-22 09:16:49.389938739 +0000 UTC m=+800.215849284" observedRunningTime="2026-01-22 09:16:49.953130084 +0000 UTC m=+800.779040609" watchObservedRunningTime="2026-01-22 09:16:49.95749171 +0000 UTC m=+800.783402255" Jan 22 09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.022788 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hqjbq"] Jan 22 09:17:13 crc kubenswrapper[4681]: E0122 09:17:13.024033 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb386249-9f93-4235-a26e-77274da23692" containerName="registry-server" Jan 22 09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.024062 4681 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fb386249-9f93-4235-a26e-77274da23692" containerName="registry-server" Jan 22 09:17:13 crc kubenswrapper[4681]: E0122 09:17:13.024098 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb386249-9f93-4235-a26e-77274da23692" containerName="extract-utilities" Jan 22 09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.024115 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb386249-9f93-4235-a26e-77274da23692" containerName="extract-utilities" Jan 22 09:17:13 crc kubenswrapper[4681]: E0122 09:17:13.024160 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb386249-9f93-4235-a26e-77274da23692" containerName="extract-content" Jan 22 09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.024178 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb386249-9f93-4235-a26e-77274da23692" containerName="extract-content" Jan 22 09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.024442 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb386249-9f93-4235-a26e-77274da23692" containerName="registry-server" Jan 22 09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.025902 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqjbq" Jan 22 09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.043648 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hqjbq"] Jan 22 09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.091905 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q26qh\" (UniqueName: \"kubernetes.io/projected/c9483236-cb01-4121-81a2-7a34070aa3f7-kube-api-access-q26qh\") pod \"certified-operators-hqjbq\" (UID: \"c9483236-cb01-4121-81a2-7a34070aa3f7\") " pod="openshift-marketplace/certified-operators-hqjbq" Jan 22 09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.091975 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9483236-cb01-4121-81a2-7a34070aa3f7-utilities\") pod \"certified-operators-hqjbq\" (UID: \"c9483236-cb01-4121-81a2-7a34070aa3f7\") " pod="openshift-marketplace/certified-operators-hqjbq" Jan 22 09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.092147 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9483236-cb01-4121-81a2-7a34070aa3f7-catalog-content\") pod \"certified-operators-hqjbq\" (UID: \"c9483236-cb01-4121-81a2-7a34070aa3f7\") " pod="openshift-marketplace/certified-operators-hqjbq" Jan 22 09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.192936 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9483236-cb01-4121-81a2-7a34070aa3f7-catalog-content\") pod \"certified-operators-hqjbq\" (UID: \"c9483236-cb01-4121-81a2-7a34070aa3f7\") " pod="openshift-marketplace/certified-operators-hqjbq" Jan 22 09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.193462 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q26qh\" (UniqueName: \"kubernetes.io/projected/c9483236-cb01-4121-81a2-7a34070aa3f7-kube-api-access-q26qh\") pod \"certified-operators-hqjbq\" (UID: \"c9483236-cb01-4121-81a2-7a34070aa3f7\") " pod="openshift-marketplace/certified-operators-hqjbq" Jan 22 
09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.193548 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9483236-cb01-4121-81a2-7a34070aa3f7-utilities\") pod \"certified-operators-hqjbq\" (UID: \"c9483236-cb01-4121-81a2-7a34070aa3f7\") " pod="openshift-marketplace/certified-operators-hqjbq" Jan 22 09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.194132 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9483236-cb01-4121-81a2-7a34070aa3f7-utilities\") pod \"certified-operators-hqjbq\" (UID: \"c9483236-cb01-4121-81a2-7a34070aa3f7\") " pod="openshift-marketplace/certified-operators-hqjbq" Jan 22 09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.194466 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9483236-cb01-4121-81a2-7a34070aa3f7-catalog-content\") pod \"certified-operators-hqjbq\" (UID: \"c9483236-cb01-4121-81a2-7a34070aa3f7\") " pod="openshift-marketplace/certified-operators-hqjbq" Jan 22 09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.221147 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q26qh\" (UniqueName: \"kubernetes.io/projected/c9483236-cb01-4121-81a2-7a34070aa3f7-kube-api-access-q26qh\") pod \"certified-operators-hqjbq\" (UID: \"c9483236-cb01-4121-81a2-7a34070aa3f7\") " pod="openshift-marketplace/certified-operators-hqjbq" Jan 22 09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.365040 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqjbq" Jan 22 09:17:13 crc kubenswrapper[4681]: I0122 09:17:13.614702 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hqjbq"] Jan 22 09:17:14 crc kubenswrapper[4681]: I0122 09:17:14.117657 4681 generic.go:334] "Generic (PLEG): container finished" podID="c9483236-cb01-4121-81a2-7a34070aa3f7" containerID="e78b5c25fc84d102fe4f074075520a3ecc9fba333909b9f40e9a9478a75372ef" exitCode=0 Jan 22 09:17:14 crc kubenswrapper[4681]: I0122 09:17:14.117713 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqjbq" event={"ID":"c9483236-cb01-4121-81a2-7a34070aa3f7","Type":"ContainerDied","Data":"e78b5c25fc84d102fe4f074075520a3ecc9fba333909b9f40e9a9478a75372ef"} Jan 22 09:17:14 crc kubenswrapper[4681]: I0122 09:17:14.117983 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqjbq" event={"ID":"c9483236-cb01-4121-81a2-7a34070aa3f7","Type":"ContainerStarted","Data":"a2cee82b11f375573a976bc1c4f3301af66a3549c134800b1e613895b8e0424a"} Jan 22 09:17:16 crc kubenswrapper[4681]: I0122 09:17:16.138149 4681 generic.go:334] "Generic (PLEG): container finished" podID="c9483236-cb01-4121-81a2-7a34070aa3f7" containerID="34d9ea52591e3fb4bb4f9c5f236d0733ea15815b7b08a0ccc45c90a1ad9d9405" exitCode=0 Jan 22 09:17:16 crc kubenswrapper[4681]: I0122 09:17:16.138323 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqjbq" event={"ID":"c9483236-cb01-4121-81a2-7a34070aa3f7","Type":"ContainerDied","Data":"34d9ea52591e3fb4bb4f9c5f236d0733ea15815b7b08a0ccc45c90a1ad9d9405"} Jan 22 09:17:17 crc kubenswrapper[4681]: I0122 09:17:17.155179 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-hqjbq" event={"ID":"c9483236-cb01-4121-81a2-7a34070aa3f7","Type":"ContainerStarted","Data":"d38487a4658c66b8f19c3ab7a11931fa1c0f6435e0167e064993465106011649"} Jan 22 09:17:17 crc kubenswrapper[4681]: I0122 09:17:17.185737 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hqjbq" podStartSLOduration=2.770835342 podStartE2EDuration="5.185713027s" podCreationTimestamp="2026-01-22 09:17:12 +0000 UTC" firstStartedPulling="2026-01-22 09:17:14.120309654 +0000 UTC m=+824.946220159" lastFinishedPulling="2026-01-22 09:17:16.535187309 +0000 UTC m=+827.361097844" observedRunningTime="2026-01-22 09:17:17.184153385 +0000 UTC m=+828.010063930" watchObservedRunningTime="2026-01-22 09:17:17.185713027 +0000 UTC m=+828.011623572" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.532655 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-r5tvj"] Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.533827 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.538289 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.538508 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.538750 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.538964 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.539141 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.539296 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-qhl2k" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.543745 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.550038 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-r5tvj"] Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.672995 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbthx\" (UniqueName: \"kubernetes.io/projected/0f9dbef0-752d-4587-ae43-d5b405a24a7d-kube-api-access-fbthx\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.673064 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: 
\"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.673125 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.673146 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-sasl-users\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.673279 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/0f9dbef0-752d-4587-ae43-d5b405a24a7d-sasl-config\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.673326 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.673399 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.774796 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbthx\" (UniqueName: \"kubernetes.io/projected/0f9dbef0-752d-4587-ae43-d5b405a24a7d-kube-api-access-fbthx\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.775138 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.775211 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: 
\"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.775276 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-sasl-users\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.775333 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/0f9dbef0-752d-4587-ae43-d5b405a24a7d-sasl-config\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.775357 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.775390 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.777958 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/0f9dbef0-752d-4587-ae43-d5b405a24a7d-sasl-config\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.781409 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.782665 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-sasl-users\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.786712 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-inter-router-credentials\") pod 
\"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.789023 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.791613 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.804979 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbthx\" (UniqueName: \"kubernetes.io/projected/0f9dbef0-752d-4587-ae43-d5b405a24a7d-kube-api-access-fbthx\") pod \"default-interconnect-68864d46cb-r5tvj\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:18 crc kubenswrapper[4681]: I0122 09:17:18.856833 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:17:19 crc kubenswrapper[4681]: I0122 09:17:19.319585 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-r5tvj"] Jan 22 09:17:20 crc kubenswrapper[4681]: I0122 09:17:20.180671 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" event={"ID":"0f9dbef0-752d-4587-ae43-d5b405a24a7d","Type":"ContainerStarted","Data":"5e749288731eb69c1508916b718024efa5e213537c6ff70f0936d27968793352"} Jan 22 09:17:23 crc kubenswrapper[4681]: I0122 09:17:23.365831 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hqjbq" Jan 22 09:17:23 crc kubenswrapper[4681]: I0122 09:17:23.366421 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hqjbq" Jan 22 09:17:23 crc kubenswrapper[4681]: I0122 09:17:23.418574 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hqjbq" Jan 22 09:17:24 crc kubenswrapper[4681]: I0122 09:17:24.264831 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hqjbq" Jan 22 09:17:25 crc kubenswrapper[4681]: I0122 09:17:25.221934 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" event={"ID":"0f9dbef0-752d-4587-ae43-d5b405a24a7d","Type":"ContainerStarted","Data":"d099710a840bc24a224e2e6ef77decf9772067ea6144a503c25acecdfe7ebb4b"} Jan 22 09:17:25 crc kubenswrapper[4681]: I0122 09:17:25.260816 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" podStartSLOduration=1.91052841 podStartE2EDuration="7.260788353s" 
podCreationTimestamp="2026-01-22 09:17:18 +0000 UTC" firstStartedPulling="2026-01-22 09:17:19.342030754 +0000 UTC m=+830.167941289" lastFinishedPulling="2026-01-22 09:17:24.692290727 +0000 UTC m=+835.518201232" observedRunningTime="2026-01-22 09:17:25.244739128 +0000 UTC m=+836.070649653" watchObservedRunningTime="2026-01-22 09:17:25.260788353 +0000 UTC m=+836.086698888" Jan 22 09:17:26 crc kubenswrapper[4681]: I0122 09:17:26.607913 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hqjbq"] Jan 22 09:17:26 crc kubenswrapper[4681]: I0122 09:17:26.608338 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hqjbq" podUID="c9483236-cb01-4121-81a2-7a34070aa3f7" containerName="registry-server" containerID="cri-o://d38487a4658c66b8f19c3ab7a11931fa1c0f6435e0167e064993465106011649" gracePeriod=2 Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.118851 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqjbq" Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.217757 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q26qh\" (UniqueName: \"kubernetes.io/projected/c9483236-cb01-4121-81a2-7a34070aa3f7-kube-api-access-q26qh\") pod \"c9483236-cb01-4121-81a2-7a34070aa3f7\" (UID: \"c9483236-cb01-4121-81a2-7a34070aa3f7\") " Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.217896 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9483236-cb01-4121-81a2-7a34070aa3f7-catalog-content\") pod \"c9483236-cb01-4121-81a2-7a34070aa3f7\" (UID: \"c9483236-cb01-4121-81a2-7a34070aa3f7\") " Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.217935 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9483236-cb01-4121-81a2-7a34070aa3f7-utilities\") pod \"c9483236-cb01-4121-81a2-7a34070aa3f7\" (UID: \"c9483236-cb01-4121-81a2-7a34070aa3f7\") " Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.219105 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9483236-cb01-4121-81a2-7a34070aa3f7-utilities" (OuterVolumeSpecName: "utilities") pod "c9483236-cb01-4121-81a2-7a34070aa3f7" (UID: "c9483236-cb01-4121-81a2-7a34070aa3f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.229146 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9483236-cb01-4121-81a2-7a34070aa3f7-kube-api-access-q26qh" (OuterVolumeSpecName: "kube-api-access-q26qh") pod "c9483236-cb01-4121-81a2-7a34070aa3f7" (UID: "c9483236-cb01-4121-81a2-7a34070aa3f7"). InnerVolumeSpecName "kube-api-access-q26qh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.250023 4681 generic.go:334] "Generic (PLEG): container finished" podID="c9483236-cb01-4121-81a2-7a34070aa3f7" containerID="d38487a4658c66b8f19c3ab7a11931fa1c0f6435e0167e064993465106011649" exitCode=0 Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.250070 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqjbq" event={"ID":"c9483236-cb01-4121-81a2-7a34070aa3f7","Type":"ContainerDied","Data":"d38487a4658c66b8f19c3ab7a11931fa1c0f6435e0167e064993465106011649"} Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.250092 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqjbq" Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.250119 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqjbq" event={"ID":"c9483236-cb01-4121-81a2-7a34070aa3f7","Type":"ContainerDied","Data":"a2cee82b11f375573a976bc1c4f3301af66a3549c134800b1e613895b8e0424a"} Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.250143 4681 scope.go:117] "RemoveContainer" containerID="d38487a4658c66b8f19c3ab7a11931fa1c0f6435e0167e064993465106011649" Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.279730 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9483236-cb01-4121-81a2-7a34070aa3f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9483236-cb01-4121-81a2-7a34070aa3f7" (UID: "c9483236-cb01-4121-81a2-7a34070aa3f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.285755 4681 scope.go:117] "RemoveContainer" containerID="34d9ea52591e3fb4bb4f9c5f236d0733ea15815b7b08a0ccc45c90a1ad9d9405" Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.307913 4681 scope.go:117] "RemoveContainer" containerID="e78b5c25fc84d102fe4f074075520a3ecc9fba333909b9f40e9a9478a75372ef" Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.320165 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9483236-cb01-4121-81a2-7a34070aa3f7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.320204 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9483236-cb01-4121-81a2-7a34070aa3f7-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.320218 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q26qh\" (UniqueName: \"kubernetes.io/projected/c9483236-cb01-4121-81a2-7a34070aa3f7-kube-api-access-q26qh\") on node \"crc\" DevicePath \"\"" Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.335339 4681 scope.go:117] "RemoveContainer" containerID="d38487a4658c66b8f19c3ab7a11931fa1c0f6435e0167e064993465106011649" Jan 22 09:17:28 crc kubenswrapper[4681]: E0122 09:17:28.335873 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d38487a4658c66b8f19c3ab7a11931fa1c0f6435e0167e064993465106011649\": container with ID starting with d38487a4658c66b8f19c3ab7a11931fa1c0f6435e0167e064993465106011649 not found: ID does not exist" 
containerID="d38487a4658c66b8f19c3ab7a11931fa1c0f6435e0167e064993465106011649" Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.335926 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d38487a4658c66b8f19c3ab7a11931fa1c0f6435e0167e064993465106011649"} err="failed to get container status \"d38487a4658c66b8f19c3ab7a11931fa1c0f6435e0167e064993465106011649\": rpc error: code = NotFound desc = could not find container \"d38487a4658c66b8f19c3ab7a11931fa1c0f6435e0167e064993465106011649\": container with ID starting with d38487a4658c66b8f19c3ab7a11931fa1c0f6435e0167e064993465106011649 not found: ID does not exist" Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.335960 4681 scope.go:117] "RemoveContainer" containerID="34d9ea52591e3fb4bb4f9c5f236d0733ea15815b7b08a0ccc45c90a1ad9d9405" Jan 22 09:17:28 crc kubenswrapper[4681]: E0122 09:17:28.336439 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d9ea52591e3fb4bb4f9c5f236d0733ea15815b7b08a0ccc45c90a1ad9d9405\": container with ID starting with 34d9ea52591e3fb4bb4f9c5f236d0733ea15815b7b08a0ccc45c90a1ad9d9405 not found: ID does not exist" containerID="34d9ea52591e3fb4bb4f9c5f236d0733ea15815b7b08a0ccc45c90a1ad9d9405" Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.336472 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d9ea52591e3fb4bb4f9c5f236d0733ea15815b7b08a0ccc45c90a1ad9d9405"} err="failed to get container status \"34d9ea52591e3fb4bb4f9c5f236d0733ea15815b7b08a0ccc45c90a1ad9d9405\": rpc error: code = NotFound desc = could not find container \"34d9ea52591e3fb4bb4f9c5f236d0733ea15815b7b08a0ccc45c90a1ad9d9405\": container with ID starting with 34d9ea52591e3fb4bb4f9c5f236d0733ea15815b7b08a0ccc45c90a1ad9d9405 not found: ID does not exist" Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.336636 4681 scope.go:117] "RemoveContainer" containerID="e78b5c25fc84d102fe4f074075520a3ecc9fba333909b9f40e9a9478a75372ef" Jan 22 09:17:28 crc kubenswrapper[4681]: E0122 09:17:28.336942 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e78b5c25fc84d102fe4f074075520a3ecc9fba333909b9f40e9a9478a75372ef\": container with ID starting with e78b5c25fc84d102fe4f074075520a3ecc9fba333909b9f40e9a9478a75372ef not found: ID does not exist" containerID="e78b5c25fc84d102fe4f074075520a3ecc9fba333909b9f40e9a9478a75372ef" Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.336978 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e78b5c25fc84d102fe4f074075520a3ecc9fba333909b9f40e9a9478a75372ef"} err="failed to get container status \"e78b5c25fc84d102fe4f074075520a3ecc9fba333909b9f40e9a9478a75372ef\": rpc error: code = NotFound desc = could not find container \"e78b5c25fc84d102fe4f074075520a3ecc9fba333909b9f40e9a9478a75372ef\": container with ID starting with e78b5c25fc84d102fe4f074075520a3ecc9fba333909b9f40e9a9478a75372ef not found: ID does not exist" Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.606645 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hqjbq"] Jan 22 09:17:28 crc kubenswrapper[4681]: I0122 09:17:28.613957 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hqjbq"] Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.070852 
4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Jan 22 09:17:29 crc kubenswrapper[4681]: E0122 09:17:29.071329 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9483236-cb01-4121-81a2-7a34070aa3f7" containerName="registry-server" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.071361 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9483236-cb01-4121-81a2-7a34070aa3f7" containerName="registry-server" Jan 22 09:17:29 crc kubenswrapper[4681]: E0122 09:17:29.071381 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9483236-cb01-4121-81a2-7a34070aa3f7" containerName="extract-content" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.071393 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9483236-cb01-4121-81a2-7a34070aa3f7" containerName="extract-content" Jan 22 09:17:29 crc kubenswrapper[4681]: E0122 09:17:29.071415 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9483236-cb01-4121-81a2-7a34070aa3f7" containerName="extract-utilities" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.071431 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9483236-cb01-4121-81a2-7a34070aa3f7" containerName="extract-utilities" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.071624 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9483236-cb01-4121-81a2-7a34070aa3f7" containerName="registry-server" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.073434 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.078478 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.078711 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.078867 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.080121 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.080173 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.081386 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.081629 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-fzhxx" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.082250 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.082407 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.082407 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 
09:17:29.106086 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.131762 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.132472 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qvw2\" (UniqueName: \"kubernetes.io/projected/187498e5-8d96-4912-8ad6-5a87ddca4a88-kube-api-access-2qvw2\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.132528 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/187498e5-8d96-4912-8ad6-5a87ddca4a88-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.132560 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/187498e5-8d96-4912-8ad6-5a87ddca4a88-config-out\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.132601 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/187498e5-8d96-4912-8ad6-5a87ddca4a88-tls-assets\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.132640 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/187498e5-8d96-4912-8ad6-5a87ddca4a88-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.132744 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.132946 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-web-config\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.132995 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/187498e5-8d96-4912-8ad6-5a87ddca4a88-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.133071 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3b55a100-f6dd-4cbc-a3ac-08debb49e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3b55a100-f6dd-4cbc-a3ac-08debb49e582\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.133145 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/187498e5-8d96-4912-8ad6-5a87ddca4a88-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.133254 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-config\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.234955 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-web-config\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.235025 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/187498e5-8d96-4912-8ad6-5a87ddca4a88-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.235076 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3b55a100-f6dd-4cbc-a3ac-08debb49e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3b55a100-f6dd-4cbc-a3ac-08debb49e582\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.235123 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/187498e5-8d96-4912-8ad6-5a87ddca4a88-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.235182 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-config\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " 
pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.235229 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.235333 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qvw2\" (UniqueName: \"kubernetes.io/projected/187498e5-8d96-4912-8ad6-5a87ddca4a88-kube-api-access-2qvw2\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.235376 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/187498e5-8d96-4912-8ad6-5a87ddca4a88-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.235414 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/187498e5-8d96-4912-8ad6-5a87ddca4a88-config-out\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.235452 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/187498e5-8d96-4912-8ad6-5a87ddca4a88-tls-assets\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.235493 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/187498e5-8d96-4912-8ad6-5a87ddca4a88-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.235533 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: E0122 09:17:29.235707 4681 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Jan 22 09:17:29 crc kubenswrapper[4681]: E0122 09:17:29.235801 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-secret-default-prometheus-proxy-tls podName:187498e5-8d96-4912-8ad6-5a87ddca4a88 nodeName:}" failed. No retries permitted until 2026-01-22 09:17:29.735776224 +0000 UTC m=+840.561686769 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "187498e5-8d96-4912-8ad6-5a87ddca4a88") : secret "default-prometheus-proxy-tls" not found Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.236845 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/187498e5-8d96-4912-8ad6-5a87ddca4a88-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.236893 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/187498e5-8d96-4912-8ad6-5a87ddca4a88-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.236943 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/187498e5-8d96-4912-8ad6-5a87ddca4a88-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.238808 4681 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.238918 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3b55a100-f6dd-4cbc-a3ac-08debb49e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3b55a100-f6dd-4cbc-a3ac-08debb49e582\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/af3fff23491ecc0ee20f2e0bd0f623c4635401f18ad7fac4002e7bceb5e54f2a/globalmount\"" pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.238940 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/187498e5-8d96-4912-8ad6-5a87ddca4a88-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.244678 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/187498e5-8d96-4912-8ad6-5a87ddca4a88-config-out\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.245451 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 
09:17:29.250081 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/187498e5-8d96-4912-8ad6-5a87ddca4a88-tls-assets\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.260432 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-web-config\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.262982 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-config\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.267645 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qvw2\" (UniqueName: \"kubernetes.io/projected/187498e5-8d96-4912-8ad6-5a87ddca4a88-kube-api-access-2qvw2\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.287242 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3b55a100-f6dd-4cbc-a3ac-08debb49e582\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3b55a100-f6dd-4cbc-a3ac-08debb49e582\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.469301 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9483236-cb01-4121-81a2-7a34070aa3f7" path="/var/lib/kubelet/pods/c9483236-cb01-4121-81a2-7a34070aa3f7/volumes" Jan 22 09:17:29 crc kubenswrapper[4681]: I0122 09:17:29.744025 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:29 crc kubenswrapper[4681]: E0122 09:17:29.744303 4681 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Jan 22 09:17:29 crc kubenswrapper[4681]: E0122 09:17:29.744411 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-secret-default-prometheus-proxy-tls podName:187498e5-8d96-4912-8ad6-5a87ddca4a88 nodeName:}" failed. No retries permitted until 2026-01-22 09:17:30.744382164 +0000 UTC m=+841.570292699 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "187498e5-8d96-4912-8ad6-5a87ddca4a88") : secret "default-prometheus-proxy-tls" not found Jan 22 09:17:30 crc kubenswrapper[4681]: I0122 09:17:30.760101 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:30 crc kubenswrapper[4681]: I0122 09:17:30.767709 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/187498e5-8d96-4912-8ad6-5a87ddca4a88-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"187498e5-8d96-4912-8ad6-5a87ddca4a88\") " pod="service-telemetry/prometheus-default-0" Jan 22 09:17:30 crc kubenswrapper[4681]: I0122 09:17:30.898995 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-fzhxx" Jan 22 09:17:30 crc kubenswrapper[4681]: I0122 09:17:30.907144 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0" Jan 22 09:17:31 crc kubenswrapper[4681]: I0122 09:17:31.404346 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Jan 22 09:17:31 crc kubenswrapper[4681]: W0122 09:17:31.417247 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod187498e5_8d96_4912_8ad6_5a87ddca4a88.slice/crio-f51b5104376f23c78fa264721e87fdeca4d5bf3be68eeb5054806161076d5c64 WatchSource:0}: Error finding container f51b5104376f23c78fa264721e87fdeca4d5bf3be68eeb5054806161076d5c64: Status 404 returned error can't find the container with id f51b5104376f23c78fa264721e87fdeca4d5bf3be68eeb5054806161076d5c64 Jan 22 09:17:32 crc kubenswrapper[4681]: I0122 09:17:32.292523 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"187498e5-8d96-4912-8ad6-5a87ddca4a88","Type":"ContainerStarted","Data":"f51b5104376f23c78fa264721e87fdeca4d5bf3be68eeb5054806161076d5c64"} Jan 22 09:17:38 crc kubenswrapper[4681]: I0122 09:17:38.336242 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"187498e5-8d96-4912-8ad6-5a87ddca4a88","Type":"ContainerStarted","Data":"07143959b345eb4c48ffd80add184761336661d9b974e60f07c8c1b6c8efdbe0"} Jan 22 09:17:38 crc kubenswrapper[4681]: I0122 09:17:38.875831 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-78bcbbdcff-dh6q6"] Jan 22 09:17:38 crc kubenswrapper[4681]: I0122 09:17:38.877075 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-78bcbbdcff-dh6q6" Jan 22 09:17:38 crc kubenswrapper[4681]: I0122 09:17:38.889550 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-78bcbbdcff-dh6q6"] Jan 22 09:17:39 crc kubenswrapper[4681]: I0122 09:17:38.999824 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b2cv\" (UniqueName: \"kubernetes.io/projected/3f65bf37-6f6e-4117-b952-d8d859f01094-kube-api-access-4b2cv\") pod \"default-snmp-webhook-78bcbbdcff-dh6q6\" (UID: \"3f65bf37-6f6e-4117-b952-d8d859f01094\") " pod="service-telemetry/default-snmp-webhook-78bcbbdcff-dh6q6" Jan 22 09:17:39 crc kubenswrapper[4681]: I0122 09:17:39.101217 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b2cv\" (UniqueName: \"kubernetes.io/projected/3f65bf37-6f6e-4117-b952-d8d859f01094-kube-api-access-4b2cv\") pod \"default-snmp-webhook-78bcbbdcff-dh6q6\" (UID: \"3f65bf37-6f6e-4117-b952-d8d859f01094\") " pod="service-telemetry/default-snmp-webhook-78bcbbdcff-dh6q6" Jan 22 09:17:39 crc kubenswrapper[4681]: I0122 09:17:39.134660 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b2cv\" (UniqueName: \"kubernetes.io/projected/3f65bf37-6f6e-4117-b952-d8d859f01094-kube-api-access-4b2cv\") pod \"default-snmp-webhook-78bcbbdcff-dh6q6\" (UID: \"3f65bf37-6f6e-4117-b952-d8d859f01094\") " pod="service-telemetry/default-snmp-webhook-78bcbbdcff-dh6q6" Jan 22 09:17:39 crc kubenswrapper[4681]: I0122 09:17:39.193779 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-78bcbbdcff-dh6q6" Jan 22 09:17:39 crc kubenswrapper[4681]: I0122 09:17:39.495173 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-78bcbbdcff-dh6q6"] Jan 22 09:17:39 crc kubenswrapper[4681]: W0122 09:17:39.503675 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f65bf37_6f6e_4117_b952_d8d859f01094.slice/crio-06c143baaf175f738aeb427b34beabcbc9c54ae006d12e5d56f26eb6e7c31719 WatchSource:0}: Error finding container 06c143baaf175f738aeb427b34beabcbc9c54ae006d12e5d56f26eb6e7c31719: Status 404 returned error can't find the container with id 06c143baaf175f738aeb427b34beabcbc9c54ae006d12e5d56f26eb6e7c31719 Jan 22 09:17:39 crc kubenswrapper[4681]: I0122 09:17:39.506521 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 09:17:40 crc kubenswrapper[4681]: I0122 09:17:40.354819 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-78bcbbdcff-dh6q6" event={"ID":"3f65bf37-6f6e-4117-b952-d8d859f01094","Type":"ContainerStarted","Data":"06c143baaf175f738aeb427b34beabcbc9c54ae006d12e5d56f26eb6e7c31719"} Jan 22 09:17:42 crc kubenswrapper[4681]: I0122 09:17:42.960188 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"] Jan 22 09:17:42 crc kubenswrapper[4681]: I0122 09:17:42.961849 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:42 crc kubenswrapper[4681]: I0122 09:17:42.964603 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated" Jan 22 09:17:42 crc kubenswrapper[4681]: I0122 09:17:42.964686 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls" Jan 22 09:17:42 crc kubenswrapper[4681]: I0122 09:17:42.964735 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-rbm7n" Jan 22 09:17:42 crc kubenswrapper[4681]: I0122 09:17:42.964879 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config" Jan 22 09:17:42 crc kubenswrapper[4681]: I0122 09:17:42.965038 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0" Jan 22 09:17:42 crc kubenswrapper[4681]: I0122 09:17:42.965174 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config" Jan 22 09:17:42 crc kubenswrapper[4681]: I0122 09:17:42.982952 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.060657 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.060706 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.060741 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-config-volume\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.060774 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-web-config\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.060971 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ef78e840-4520-4de1-8abe-af82e052bfa3-config-out\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.061028 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ef78e840-4520-4de1-8abe-af82e052bfa3-tls-assets\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.061085 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mksb\" (UniqueName: \"kubernetes.io/projected/ef78e840-4520-4de1-8abe-af82e052bfa3-kube-api-access-8mksb\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.061120 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3f29cf97-ffde-43cb-814e-6b8ff6d7b41e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f29cf97-ffde-43cb-814e-6b8ff6d7b41e\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.061150 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.162643 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mksb\" (UniqueName: \"kubernetes.io/projected/ef78e840-4520-4de1-8abe-af82e052bfa3-kube-api-access-8mksb\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.162711 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3f29cf97-ffde-43cb-814e-6b8ff6d7b41e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f29cf97-ffde-43cb-814e-6b8ff6d7b41e\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.162749 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.162809 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.162867 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: 
\"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.162923 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-config-volume\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.162991 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-web-config\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.163060 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ef78e840-4520-4de1-8abe-af82e052bfa3-config-out\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.163090 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ef78e840-4520-4de1-8abe-af82e052bfa3-tls-assets\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: E0122 09:17:43.163103 4681 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Jan 22 09:17:43 crc kubenswrapper[4681]: E0122 09:17:43.163198 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-secret-default-alertmanager-proxy-tls podName:ef78e840-4520-4de1-8abe-af82e052bfa3 nodeName:}" failed. No retries permitted until 2026-01-22 09:17:43.663172465 +0000 UTC m=+854.489083070 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "ef78e840-4520-4de1-8abe-af82e052bfa3") : secret "default-alertmanager-proxy-tls" not found Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.168656 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ef78e840-4520-4de1-8abe-af82e052bfa3-config-out\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.168700 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.168689 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-config-volume\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.169113 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-web-config\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.169462 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ef78e840-4520-4de1-8abe-af82e052bfa3-tls-assets\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.171864 4681 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
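[editor's note] The nestedpendingoperations entries around this point show the kubelet backing off on the failed mount of the "default-alertmanager-proxy-tls" secret: the first retry is deferred by 500ms, and the retries that follow in the entries below are deferred by 1s and then 2s, until the secret finally exists and MountVolume.SetUp succeeds at 09:17:46. The Go sketch below only illustrates that doubling-delay pattern; the initial delay, the cap, and the loop are assumptions for illustration and are not kubelet source code.

// backoff_sketch.go - illustrative only; mirrors the durationBeforeRetry
// progression (500ms, 1s, 2s, ...) printed by the log entries above/below.
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond          // assumed first retry delay after a mount failure
	maxDelay := 2*time.Minute + 2*time.Second // assumed upper bound on the delay
	for attempt := 1; attempt <= 5; attempt++ {
		fmt.Printf("attempt %d failed: secret not found; no retries permitted for %v\n", attempt, delay)
		delay *= 2 // double the wait after each consecutive failure
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}

Running the sketch prints delays of 500ms, 1s, 2s, 4s, 8s, which is the same shape as the "durationBeforeRetry" values recorded for the secret mounts in this log; once the missing secret is created, the next retry succeeds and the backoff resets.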
Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.171941 4681 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3f29cf97-ffde-43cb-814e-6b8ff6d7b41e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f29cf97-ffde-43cb-814e-6b8ff6d7b41e\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/abf336f49a9a8a33a72e57de67d95456e72cc245e366176d45e6da1eb185d557/globalmount\"" pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.173615 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.180074 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mksb\" (UniqueName: \"kubernetes.io/projected/ef78e840-4520-4de1-8abe-af82e052bfa3-kube-api-access-8mksb\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.208102 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3f29cf97-ffde-43cb-814e-6b8ff6d7b41e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f29cf97-ffde-43cb-814e-6b8ff6d7b41e\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: I0122 09:17:43.668974 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:43 crc kubenswrapper[4681]: E0122 09:17:43.669168 4681 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Jan 22 09:17:43 crc kubenswrapper[4681]: E0122 09:17:43.669224 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-secret-default-alertmanager-proxy-tls podName:ef78e840-4520-4de1-8abe-af82e052bfa3 nodeName:}" failed. No retries permitted until 2026-01-22 09:17:44.669206646 +0000 UTC m=+855.495117151 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "ef78e840-4520-4de1-8abe-af82e052bfa3") : secret "default-alertmanager-proxy-tls" not found Jan 22 09:17:44 crc kubenswrapper[4681]: I0122 09:17:44.683983 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:44 crc kubenswrapper[4681]: E0122 09:17:44.684110 4681 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Jan 22 09:17:44 crc kubenswrapper[4681]: E0122 09:17:44.684432 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-secret-default-alertmanager-proxy-tls podName:ef78e840-4520-4de1-8abe-af82e052bfa3 nodeName:}" failed. No retries permitted until 2026-01-22 09:17:46.684414821 +0000 UTC m=+857.510325326 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "ef78e840-4520-4de1-8abe-af82e052bfa3") : secret "default-alertmanager-proxy-tls" not found Jan 22 09:17:45 crc kubenswrapper[4681]: I0122 09:17:45.408071 4681 generic.go:334] "Generic (PLEG): container finished" podID="187498e5-8d96-4912-8ad6-5a87ddca4a88" containerID="07143959b345eb4c48ffd80add184761336661d9b974e60f07c8c1b6c8efdbe0" exitCode=0 Jan 22 09:17:45 crc kubenswrapper[4681]: I0122 09:17:45.408132 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"187498e5-8d96-4912-8ad6-5a87ddca4a88","Type":"ContainerDied","Data":"07143959b345eb4c48ffd80add184761336661d9b974e60f07c8c1b6c8efdbe0"} Jan 22 09:17:46 crc kubenswrapper[4681]: I0122 09:17:46.713821 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:46 crc kubenswrapper[4681]: I0122 09:17:46.724530 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef78e840-4520-4de1-8abe-af82e052bfa3-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"ef78e840-4520-4de1-8abe-af82e052bfa3\") " pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:46 crc kubenswrapper[4681]: I0122 09:17:46.954966 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Jan 22 09:17:48 crc kubenswrapper[4681]: I0122 09:17:48.727974 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Jan 22 09:17:48 crc kubenswrapper[4681]: W0122 09:17:48.753873 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef78e840_4520_4de1_8abe_af82e052bfa3.slice/crio-1c8d8122924e30d0770b2723f256557242d810ed6aa004dda264570b4c504a30 WatchSource:0}: Error finding container 1c8d8122924e30d0770b2723f256557242d810ed6aa004dda264570b4c504a30: Status 404 returned error can't find the container with id 1c8d8122924e30d0770b2723f256557242d810ed6aa004dda264570b4c504a30 Jan 22 09:17:49 crc kubenswrapper[4681]: I0122 09:17:49.436031 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-78bcbbdcff-dh6q6" event={"ID":"3f65bf37-6f6e-4117-b952-d8d859f01094","Type":"ContainerStarted","Data":"e15698ddb10ee3aaa02b16e2b9e2e6eb9cb8b10d97160170b1be8d933d19f923"} Jan 22 09:17:49 crc kubenswrapper[4681]: I0122 09:17:49.437895 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"ef78e840-4520-4de1-8abe-af82e052bfa3","Type":"ContainerStarted","Data":"1c8d8122924e30d0770b2723f256557242d810ed6aa004dda264570b4c504a30"} Jan 22 09:17:49 crc kubenswrapper[4681]: I0122 09:17:49.449189 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-78bcbbdcff-dh6q6" podStartSLOduration=2.635883007 podStartE2EDuration="11.449171739s" podCreationTimestamp="2026-01-22 09:17:38 +0000 UTC" firstStartedPulling="2026-01-22 09:17:39.506208787 +0000 UTC m=+850.332119302" lastFinishedPulling="2026-01-22 09:17:48.319497529 +0000 UTC m=+859.145408034" observedRunningTime="2026-01-22 09:17:49.447555186 +0000 UTC m=+860.273465681" watchObservedRunningTime="2026-01-22 09:17:49.449171739 +0000 UTC m=+860.275082244" Jan 22 09:17:51 crc kubenswrapper[4681]: I0122 09:17:51.458729 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"ef78e840-4520-4de1-8abe-af82e052bfa3","Type":"ContainerStarted","Data":"356f7439be91cad59d29eb1558fd86da95ecc287ebe932db4b3f3bd46a18cd78"} Jan 22 09:17:51 crc kubenswrapper[4681]: I0122 09:17:51.608143 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-47zk5"] Jan 22 09:17:51 crc kubenswrapper[4681]: I0122 09:17:51.609858 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-47zk5" Jan 22 09:17:51 crc kubenswrapper[4681]: I0122 09:17:51.618303 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-47zk5"] Jan 22 09:17:51 crc kubenswrapper[4681]: I0122 09:17:51.684300 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c4ds\" (UniqueName: \"kubernetes.io/projected/97346e65-09fb-415b-a61c-34bbdb536f56-kube-api-access-6c4ds\") pod \"community-operators-47zk5\" (UID: \"97346e65-09fb-415b-a61c-34bbdb536f56\") " pod="openshift-marketplace/community-operators-47zk5" Jan 22 09:17:51 crc kubenswrapper[4681]: I0122 09:17:51.684349 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97346e65-09fb-415b-a61c-34bbdb536f56-catalog-content\") pod \"community-operators-47zk5\" (UID: \"97346e65-09fb-415b-a61c-34bbdb536f56\") " pod="openshift-marketplace/community-operators-47zk5" Jan 22 09:17:51 crc kubenswrapper[4681]: I0122 09:17:51.684535 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97346e65-09fb-415b-a61c-34bbdb536f56-utilities\") pod \"community-operators-47zk5\" (UID: \"97346e65-09fb-415b-a61c-34bbdb536f56\") " pod="openshift-marketplace/community-operators-47zk5" Jan 22 09:17:51 crc kubenswrapper[4681]: I0122 09:17:51.785243 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c4ds\" (UniqueName: \"kubernetes.io/projected/97346e65-09fb-415b-a61c-34bbdb536f56-kube-api-access-6c4ds\") pod \"community-operators-47zk5\" (UID: \"97346e65-09fb-415b-a61c-34bbdb536f56\") " pod="openshift-marketplace/community-operators-47zk5" Jan 22 09:17:51 crc kubenswrapper[4681]: I0122 09:17:51.785369 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97346e65-09fb-415b-a61c-34bbdb536f56-catalog-content\") pod \"community-operators-47zk5\" (UID: \"97346e65-09fb-415b-a61c-34bbdb536f56\") " pod="openshift-marketplace/community-operators-47zk5" Jan 22 09:17:51 crc kubenswrapper[4681]: I0122 09:17:51.785413 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97346e65-09fb-415b-a61c-34bbdb536f56-utilities\") pod \"community-operators-47zk5\" (UID: \"97346e65-09fb-415b-a61c-34bbdb536f56\") " pod="openshift-marketplace/community-operators-47zk5" Jan 22 09:17:51 crc kubenswrapper[4681]: I0122 09:17:51.786091 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97346e65-09fb-415b-a61c-34bbdb536f56-utilities\") pod \"community-operators-47zk5\" (UID: \"97346e65-09fb-415b-a61c-34bbdb536f56\") " pod="openshift-marketplace/community-operators-47zk5" Jan 22 09:17:51 crc kubenswrapper[4681]: I0122 09:17:51.786373 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97346e65-09fb-415b-a61c-34bbdb536f56-catalog-content\") pod \"community-operators-47zk5\" (UID: \"97346e65-09fb-415b-a61c-34bbdb536f56\") " pod="openshift-marketplace/community-operators-47zk5" Jan 22 09:17:51 crc kubenswrapper[4681]: I0122 09:17:51.807998 4681 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6c4ds\" (UniqueName: \"kubernetes.io/projected/97346e65-09fb-415b-a61c-34bbdb536f56-kube-api-access-6c4ds\") pod \"community-operators-47zk5\" (UID: \"97346e65-09fb-415b-a61c-34bbdb536f56\") " pod="openshift-marketplace/community-operators-47zk5" Jan 22 09:17:51 crc kubenswrapper[4681]: I0122 09:17:51.946853 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47zk5" Jan 22 09:17:52 crc kubenswrapper[4681]: I0122 09:17:52.387558 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-47zk5"] Jan 22 09:17:52 crc kubenswrapper[4681]: W0122 09:17:52.390207 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97346e65_09fb_415b_a61c_34bbdb536f56.slice/crio-a42e25b3b8ae43ae24ed69d4491d34b6333becdddcb09f7793843f6da19f6619 WatchSource:0}: Error finding container a42e25b3b8ae43ae24ed69d4491d34b6333becdddcb09f7793843f6da19f6619: Status 404 returned error can't find the container with id a42e25b3b8ae43ae24ed69d4491d34b6333becdddcb09f7793843f6da19f6619 Jan 22 09:17:52 crc kubenswrapper[4681]: I0122 09:17:52.466454 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"187498e5-8d96-4912-8ad6-5a87ddca4a88","Type":"ContainerStarted","Data":"daea5da91af35108f0f848f8e1ea814673c011f837ec7af7a7db9f56a7ce295d"} Jan 22 09:17:52 crc kubenswrapper[4681]: I0122 09:17:52.467539 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47zk5" event={"ID":"97346e65-09fb-415b-a61c-34bbdb536f56","Type":"ContainerStarted","Data":"a42e25b3b8ae43ae24ed69d4491d34b6333becdddcb09f7793843f6da19f6619"} Jan 22 09:17:53 crc kubenswrapper[4681]: I0122 09:17:53.477063 4681 generic.go:334] "Generic (PLEG): container finished" podID="97346e65-09fb-415b-a61c-34bbdb536f56" containerID="2daf0a2645c774d0080d58e0c282ed14dc3904eda43f78d58baf0fa094fe393a" exitCode=0 Jan 22 09:17:53 crc kubenswrapper[4681]: I0122 09:17:53.477097 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47zk5" event={"ID":"97346e65-09fb-415b-a61c-34bbdb536f56","Type":"ContainerDied","Data":"2daf0a2645c774d0080d58e0c282ed14dc3904eda43f78d58baf0fa094fe393a"} Jan 22 09:17:54 crc kubenswrapper[4681]: I0122 09:17:54.490874 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47zk5" event={"ID":"97346e65-09fb-415b-a61c-34bbdb536f56","Type":"ContainerStarted","Data":"d12d4b00b4aa761fe952ef7fb56489f3b5ea1cef56815dc41fe95c4c10fb8f22"} Jan 22 09:17:54 crc kubenswrapper[4681]: I0122 09:17:54.496090 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"187498e5-8d96-4912-8ad6-5a87ddca4a88","Type":"ContainerStarted","Data":"73f7189fa0cfa611cb52eb26c739482b3e4975b0fe7d614a0878219ab2eedcf5"} Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.387154 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr"] Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.388253 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.392462 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret" Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.392581 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls" Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.392633 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap" Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.392840 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-qcg4k" Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.402503 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr"] Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.503981 4681 generic.go:334] "Generic (PLEG): container finished" podID="97346e65-09fb-415b-a61c-34bbdb536f56" containerID="d12d4b00b4aa761fe952ef7fb56489f3b5ea1cef56815dc41fe95c4c10fb8f22" exitCode=0 Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.504020 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47zk5" event={"ID":"97346e65-09fb-415b-a61c-34bbdb536f56","Type":"ContainerDied","Data":"d12d4b00b4aa761fe952ef7fb56489f3b5ea1cef56815dc41fe95c4c10fb8f22"} Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.532980 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/f38c0f28-9c07-459b-89e4-0aa2c7847262-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr\" (UID: \"f38c0f28-9c07-459b-89e4-0aa2c7847262\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.533052 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54cxf\" (UniqueName: \"kubernetes.io/projected/f38c0f28-9c07-459b-89e4-0aa2c7847262-kube-api-access-54cxf\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr\" (UID: \"f38c0f28-9c07-459b-89e4-0aa2c7847262\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.533075 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f38c0f28-9c07-459b-89e4-0aa2c7847262-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr\" (UID: \"f38c0f28-9c07-459b-89e4-0aa2c7847262\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.533976 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f38c0f28-9c07-459b-89e4-0aa2c7847262-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr\" (UID: \"f38c0f28-9c07-459b-89e4-0aa2c7847262\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:55 crc kubenswrapper[4681]: 
I0122 09:17:55.534030 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f38c0f28-9c07-459b-89e4-0aa2c7847262-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr\" (UID: \"f38c0f28-9c07-459b-89e4-0aa2c7847262\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.635093 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54cxf\" (UniqueName: \"kubernetes.io/projected/f38c0f28-9c07-459b-89e4-0aa2c7847262-kube-api-access-54cxf\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr\" (UID: \"f38c0f28-9c07-459b-89e4-0aa2c7847262\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.635134 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f38c0f28-9c07-459b-89e4-0aa2c7847262-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr\" (UID: \"f38c0f28-9c07-459b-89e4-0aa2c7847262\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.635188 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f38c0f28-9c07-459b-89e4-0aa2c7847262-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr\" (UID: \"f38c0f28-9c07-459b-89e4-0aa2c7847262\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.635212 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f38c0f28-9c07-459b-89e4-0aa2c7847262-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr\" (UID: \"f38c0f28-9c07-459b-89e4-0aa2c7847262\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.635285 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/f38c0f28-9c07-459b-89e4-0aa2c7847262-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr\" (UID: \"f38c0f28-9c07-459b-89e4-0aa2c7847262\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:55 crc kubenswrapper[4681]: E0122 09:17:55.636432 4681 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Jan 22 09:17:55 crc kubenswrapper[4681]: E0122 09:17:55.636531 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f38c0f28-9c07-459b-89e4-0aa2c7847262-default-cloud1-coll-meter-proxy-tls podName:f38c0f28-9c07-459b-89e4-0aa2c7847262 nodeName:}" failed. No retries permitted until 2026-01-22 09:17:56.136511967 +0000 UTC m=+866.962422462 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/f38c0f28-9c07-459b-89e4-0aa2c7847262-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" (UID: "f38c0f28-9c07-459b-89e4-0aa2c7847262") : secret "default-cloud1-coll-meter-proxy-tls" not found Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.636658 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f38c0f28-9c07-459b-89e4-0aa2c7847262-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr\" (UID: \"f38c0f28-9c07-459b-89e4-0aa2c7847262\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.636927 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f38c0f28-9c07-459b-89e4-0aa2c7847262-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr\" (UID: \"f38c0f28-9c07-459b-89e4-0aa2c7847262\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.655363 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/f38c0f28-9c07-459b-89e4-0aa2c7847262-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr\" (UID: \"f38c0f28-9c07-459b-89e4-0aa2c7847262\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:55 crc kubenswrapper[4681]: I0122 09:17:55.655555 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54cxf\" (UniqueName: \"kubernetes.io/projected/f38c0f28-9c07-459b-89e4-0aa2c7847262-kube-api-access-54cxf\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr\" (UID: \"f38c0f28-9c07-459b-89e4-0aa2c7847262\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:56 crc kubenswrapper[4681]: I0122 09:17:56.054186 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:17:56 crc kubenswrapper[4681]: I0122 09:17:56.054530 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:17:56 crc kubenswrapper[4681]: I0122 09:17:56.154355 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f38c0f28-9c07-459b-89e4-0aa2c7847262-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr\" (UID: \"f38c0f28-9c07-459b-89e4-0aa2c7847262\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:56 crc kubenswrapper[4681]: E0122 09:17:56.154587 4681 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret 
"default-cloud1-coll-meter-proxy-tls" not found Jan 22 09:17:56 crc kubenswrapper[4681]: E0122 09:17:56.154674 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f38c0f28-9c07-459b-89e4-0aa2c7847262-default-cloud1-coll-meter-proxy-tls podName:f38c0f28-9c07-459b-89e4-0aa2c7847262 nodeName:}" failed. No retries permitted until 2026-01-22 09:17:57.154648919 +0000 UTC m=+867.980559454 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/f38c0f28-9c07-459b-89e4-0aa2c7847262-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" (UID: "f38c0f28-9c07-459b-89e4-0aa2c7847262") : secret "default-cloud1-coll-meter-proxy-tls" not found Jan 22 09:17:56 crc kubenswrapper[4681]: I0122 09:17:56.511808 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47zk5" event={"ID":"97346e65-09fb-415b-a61c-34bbdb536f56","Type":"ContainerStarted","Data":"30ba904e4efba7510edb4c1d418c3d1b0ab87738c49f64a110f0db516808aa24"} Jan 22 09:17:56 crc kubenswrapper[4681]: I0122 09:17:56.515471 4681 generic.go:334] "Generic (PLEG): container finished" podID="ef78e840-4520-4de1-8abe-af82e052bfa3" containerID="356f7439be91cad59d29eb1558fd86da95ecc287ebe932db4b3f3bd46a18cd78" exitCode=0 Jan 22 09:17:56 crc kubenswrapper[4681]: I0122 09:17:56.515510 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"ef78e840-4520-4de1-8abe-af82e052bfa3","Type":"ContainerDied","Data":"356f7439be91cad59d29eb1558fd86da95ecc287ebe932db4b3f3bd46a18cd78"} Jan 22 09:17:56 crc kubenswrapper[4681]: I0122 09:17:56.561030 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-47zk5" podStartSLOduration=3.105758027 podStartE2EDuration="5.561013485s" podCreationTimestamp="2026-01-22 09:17:51 +0000 UTC" firstStartedPulling="2026-01-22 09:17:53.479815337 +0000 UTC m=+864.305725842" lastFinishedPulling="2026-01-22 09:17:55.935070795 +0000 UTC m=+866.760981300" observedRunningTime="2026-01-22 09:17:56.532726875 +0000 UTC m=+867.358637380" watchObservedRunningTime="2026-01-22 09:17:56.561013485 +0000 UTC m=+867.386923990" Jan 22 09:17:57 crc kubenswrapper[4681]: I0122 09:17:57.174424 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f38c0f28-9c07-459b-89e4-0aa2c7847262-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr\" (UID: \"f38c0f28-9c07-459b-89e4-0aa2c7847262\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:57 crc kubenswrapper[4681]: I0122 09:17:57.180221 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f38c0f28-9c07-459b-89e4-0aa2c7847262-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr\" (UID: \"f38c0f28-9c07-459b-89e4-0aa2c7847262\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:57 crc kubenswrapper[4681]: I0122 09:17:57.201709 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.052647 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn"] Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.053782 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.055627 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.056780 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.061627 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn"] Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.185863 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/447b910c-7755-4492-87ef-fa6a55ee9698-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn\" (UID: \"447b910c-7755-4492-87ef-fa6a55ee9698\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.185917 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/447b910c-7755-4492-87ef-fa6a55ee9698-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn\" (UID: \"447b910c-7755-4492-87ef-fa6a55ee9698\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.185960 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgnk2\" (UniqueName: \"kubernetes.io/projected/447b910c-7755-4492-87ef-fa6a55ee9698-kube-api-access-tgnk2\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn\" (UID: \"447b910c-7755-4492-87ef-fa6a55ee9698\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.186024 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/447b910c-7755-4492-87ef-fa6a55ee9698-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn\" (UID: \"447b910c-7755-4492-87ef-fa6a55ee9698\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.186101 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/447b910c-7755-4492-87ef-fa6a55ee9698-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn\" (UID: \"447b910c-7755-4492-87ef-fa6a55ee9698\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.286940 4681 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/447b910c-7755-4492-87ef-fa6a55ee9698-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn\" (UID: \"447b910c-7755-4492-87ef-fa6a55ee9698\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.287008 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/447b910c-7755-4492-87ef-fa6a55ee9698-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn\" (UID: \"447b910c-7755-4492-87ef-fa6a55ee9698\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.287061 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgnk2\" (UniqueName: \"kubernetes.io/projected/447b910c-7755-4492-87ef-fa6a55ee9698-kube-api-access-tgnk2\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn\" (UID: \"447b910c-7755-4492-87ef-fa6a55ee9698\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.287104 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/447b910c-7755-4492-87ef-fa6a55ee9698-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn\" (UID: \"447b910c-7755-4492-87ef-fa6a55ee9698\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.287149 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/447b910c-7755-4492-87ef-fa6a55ee9698-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn\" (UID: \"447b910c-7755-4492-87ef-fa6a55ee9698\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:58 crc kubenswrapper[4681]: E0122 09:17:58.287287 4681 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Jan 22 09:17:58 crc kubenswrapper[4681]: E0122 09:17:58.287350 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/447b910c-7755-4492-87ef-fa6a55ee9698-default-cloud1-ceil-meter-proxy-tls podName:447b910c-7755-4492-87ef-fa6a55ee9698 nodeName:}" failed. No retries permitted until 2026-01-22 09:17:58.78732915 +0000 UTC m=+869.613239665 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/447b910c-7755-4492-87ef-fa6a55ee9698-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" (UID: "447b910c-7755-4492-87ef-fa6a55ee9698") : secret "default-cloud1-ceil-meter-proxy-tls" not found Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.287536 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/447b910c-7755-4492-87ef-fa6a55ee9698-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn\" (UID: \"447b910c-7755-4492-87ef-fa6a55ee9698\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.290969 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/447b910c-7755-4492-87ef-fa6a55ee9698-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn\" (UID: \"447b910c-7755-4492-87ef-fa6a55ee9698\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.291695 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/447b910c-7755-4492-87ef-fa6a55ee9698-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn\" (UID: \"447b910c-7755-4492-87ef-fa6a55ee9698\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.304189 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgnk2\" (UniqueName: \"kubernetes.io/projected/447b910c-7755-4492-87ef-fa6a55ee9698-kube-api-access-tgnk2\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn\" (UID: \"447b910c-7755-4492-87ef-fa6a55ee9698\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:58 crc kubenswrapper[4681]: I0122 09:17:58.793014 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/447b910c-7755-4492-87ef-fa6a55ee9698-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn\" (UID: \"447b910c-7755-4492-87ef-fa6a55ee9698\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:58 crc kubenswrapper[4681]: E0122 09:17:58.793287 4681 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Jan 22 09:17:58 crc kubenswrapper[4681]: E0122 09:17:58.793347 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/447b910c-7755-4492-87ef-fa6a55ee9698-default-cloud1-ceil-meter-proxy-tls podName:447b910c-7755-4492-87ef-fa6a55ee9698 nodeName:}" failed. No retries permitted until 2026-01-22 09:17:59.79332825 +0000 UTC m=+870.619238755 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/447b910c-7755-4492-87ef-fa6a55ee9698-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" (UID: "447b910c-7755-4492-87ef-fa6a55ee9698") : secret "default-cloud1-ceil-meter-proxy-tls" not found Jan 22 09:17:59 crc kubenswrapper[4681]: I0122 09:17:59.808356 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/447b910c-7755-4492-87ef-fa6a55ee9698-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn\" (UID: \"447b910c-7755-4492-87ef-fa6a55ee9698\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:59 crc kubenswrapper[4681]: I0122 09:17:59.814505 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/447b910c-7755-4492-87ef-fa6a55ee9698-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn\" (UID: \"447b910c-7755-4492-87ef-fa6a55ee9698\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:17:59 crc kubenswrapper[4681]: I0122 09:17:59.870815 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.137318 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x"] Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.139773 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.146809 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.146817 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.160523 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x"] Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.236247 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/f4d025fe-ae0e-48a3-bdba-35983685d558-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x\" (UID: \"f4d025fe-ae0e-48a3-bdba-35983685d558\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.236523 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f4d025fe-ae0e-48a3-bdba-35983685d558-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x\" (UID: \"f4d025fe-ae0e-48a3-bdba-35983685d558\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.236655 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4d025fe-ae0e-48a3-bdba-35983685d558-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x\" (UID: \"f4d025fe-ae0e-48a3-bdba-35983685d558\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.236815 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz9ct\" (UniqueName: \"kubernetes.io/projected/f4d025fe-ae0e-48a3-bdba-35983685d558-kube-api-access-cz9ct\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x\" (UID: \"f4d025fe-ae0e-48a3-bdba-35983685d558\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.236862 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f4d025fe-ae0e-48a3-bdba-35983685d558-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x\" (UID: \"f4d025fe-ae0e-48a3-bdba-35983685d558\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.338862 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz9ct\" (UniqueName: \"kubernetes.io/projected/f4d025fe-ae0e-48a3-bdba-35983685d558-kube-api-access-cz9ct\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x\" (UID: \"f4d025fe-ae0e-48a3-bdba-35983685d558\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 
09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.339306 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f4d025fe-ae0e-48a3-bdba-35983685d558-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x\" (UID: \"f4d025fe-ae0e-48a3-bdba-35983685d558\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.339336 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/f4d025fe-ae0e-48a3-bdba-35983685d558-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x\" (UID: \"f4d025fe-ae0e-48a3-bdba-35983685d558\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.339389 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f4d025fe-ae0e-48a3-bdba-35983685d558-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x\" (UID: \"f4d025fe-ae0e-48a3-bdba-35983685d558\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.339423 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4d025fe-ae0e-48a3-bdba-35983685d558-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x\" (UID: \"f4d025fe-ae0e-48a3-bdba-35983685d558\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:01 crc kubenswrapper[4681]: E0122 09:18:01.339549 4681 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Jan 22 09:18:01 crc kubenswrapper[4681]: E0122 09:18:01.339602 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4d025fe-ae0e-48a3-bdba-35983685d558-default-cloud1-sens-meter-proxy-tls podName:f4d025fe-ae0e-48a3-bdba-35983685d558 nodeName:}" failed. No retries permitted until 2026-01-22 09:18:01.83958595 +0000 UTC m=+872.665496455 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/f4d025fe-ae0e-48a3-bdba-35983685d558-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" (UID: "f4d025fe-ae0e-48a3-bdba-35983685d558") : secret "default-cloud1-sens-meter-proxy-tls" not found Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.340218 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f4d025fe-ae0e-48a3-bdba-35983685d558-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x\" (UID: \"f4d025fe-ae0e-48a3-bdba-35983685d558\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.340503 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f4d025fe-ae0e-48a3-bdba-35983685d558-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x\" (UID: \"f4d025fe-ae0e-48a3-bdba-35983685d558\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.360142 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/f4d025fe-ae0e-48a3-bdba-35983685d558-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x\" (UID: \"f4d025fe-ae0e-48a3-bdba-35983685d558\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.362645 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz9ct\" (UniqueName: \"kubernetes.io/projected/f4d025fe-ae0e-48a3-bdba-35983685d558-kube-api-access-cz9ct\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x\" (UID: \"f4d025fe-ae0e-48a3-bdba-35983685d558\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.744312 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr"] Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.771899 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn"] Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.851735 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4d025fe-ae0e-48a3-bdba-35983685d558-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x\" (UID: \"f4d025fe-ae0e-48a3-bdba-35983685d558\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:01 crc kubenswrapper[4681]: E0122 09:18:01.851913 4681 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Jan 22 09:18:01 crc kubenswrapper[4681]: E0122 09:18:01.851995 4681 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4d025fe-ae0e-48a3-bdba-35983685d558-default-cloud1-sens-meter-proxy-tls podName:f4d025fe-ae0e-48a3-bdba-35983685d558 nodeName:}" failed. 
No retries permitted until 2026-01-22 09:18:02.85197689 +0000 UTC m=+873.677887395 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/f4d025fe-ae0e-48a3-bdba-35983685d558-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" (UID: "f4d025fe-ae0e-48a3-bdba-35983685d558") : secret "default-cloud1-sens-meter-proxy-tls" not found Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.947046 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-47zk5" Jan 22 09:18:01 crc kubenswrapper[4681]: I0122 09:18:01.947841 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-47zk5" Jan 22 09:18:02 crc kubenswrapper[4681]: I0122 09:18:02.001139 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-47zk5" Jan 22 09:18:02 crc kubenswrapper[4681]: I0122 09:18:02.559758 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" event={"ID":"447b910c-7755-4492-87ef-fa6a55ee9698","Type":"ContainerStarted","Data":"9df463e8489bc06b7cb7a425f1bce96fe715eb06c46b2d08a225bc4d0f77d1f9"} Jan 22 09:18:02 crc kubenswrapper[4681]: I0122 09:18:02.563103 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"187498e5-8d96-4912-8ad6-5a87ddca4a88","Type":"ContainerStarted","Data":"bfba542c7106db7ef01618e221ce1f3290ce55db47dcfab102fabb16f89e1e04"} Jan 22 09:18:02 crc kubenswrapper[4681]: I0122 09:18:02.564704 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" event={"ID":"f38c0f28-9c07-459b-89e4-0aa2c7847262","Type":"ContainerStarted","Data":"ae0b530a2495bfbb93e1052c4f2569943690867168771004e9943a272d510102"} Jan 22 09:18:02 crc kubenswrapper[4681]: I0122 09:18:02.588395 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=4.271878823 podStartE2EDuration="34.588382081s" podCreationTimestamp="2026-01-22 09:17:28 +0000 UTC" firstStartedPulling="2026-01-22 09:17:31.421124229 +0000 UTC m=+842.247034734" lastFinishedPulling="2026-01-22 09:18:01.737627487 +0000 UTC m=+872.563537992" observedRunningTime="2026-01-22 09:18:02.586654665 +0000 UTC m=+873.412565170" watchObservedRunningTime="2026-01-22 09:18:02.588382081 +0000 UTC m=+873.414292586" Jan 22 09:18:02 crc kubenswrapper[4681]: I0122 09:18:02.619473 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-47zk5" Jan 22 09:18:02 crc kubenswrapper[4681]: I0122 09:18:02.657044 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-47zk5"] Jan 22 09:18:02 crc kubenswrapper[4681]: I0122 09:18:02.869995 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4d025fe-ae0e-48a3-bdba-35983685d558-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x\" (UID: \"f4d025fe-ae0e-48a3-bdba-35983685d558\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:02 crc 
kubenswrapper[4681]: I0122 09:18:02.882783 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4d025fe-ae0e-48a3-bdba-35983685d558-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x\" (UID: \"f4d025fe-ae0e-48a3-bdba-35983685d558\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:02 crc kubenswrapper[4681]: I0122 09:18:02.960012 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" Jan 22 09:18:03 crc kubenswrapper[4681]: I0122 09:18:03.479955 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x"] Jan 22 09:18:03 crc kubenswrapper[4681]: I0122 09:18:03.571818 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"ef78e840-4520-4de1-8abe-af82e052bfa3","Type":"ContainerStarted","Data":"d35cb3393c5ea6bea2459adb849c6d0c57c61262d576b2fb74a0ccd9054c36b6"} Jan 22 09:18:03 crc kubenswrapper[4681]: I0122 09:18:03.573654 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" event={"ID":"447b910c-7755-4492-87ef-fa6a55ee9698","Type":"ContainerStarted","Data":"57d75fb2fd1fac2544f3c05f5c13648f224fe3e4dd6fa0ec0be7a00d9bfd5aa8"} Jan 22 09:18:03 crc kubenswrapper[4681]: I0122 09:18:03.574741 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" event={"ID":"f4d025fe-ae0e-48a3-bdba-35983685d558","Type":"ContainerStarted","Data":"4ceab8db3bbd0be00af3036ce0cf15e1018450161f4aaf6f0a1fbcfa7ee4c89c"} Jan 22 09:18:03 crc kubenswrapper[4681]: I0122 09:18:03.576614 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" event={"ID":"f38c0f28-9c07-459b-89e4-0aa2c7847262","Type":"ContainerStarted","Data":"721c43096425d64859f6768febebeaf8c4b64d02783c847cab9edb68eb8d25c4"} Jan 22 09:18:04 crc kubenswrapper[4681]: I0122 09:18:04.583845 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" event={"ID":"f4d025fe-ae0e-48a3-bdba-35983685d558","Type":"ContainerStarted","Data":"730161e0df7aa48a6c657801c438d5b37b02148f0dfa9fd12631494f0585ae64"} Jan 22 09:18:04 crc kubenswrapper[4681]: I0122 09:18:04.584189 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-47zk5" podUID="97346e65-09fb-415b-a61c-34bbdb536f56" containerName="registry-server" containerID="cri-o://30ba904e4efba7510edb4c1d418c3d1b0ab87738c49f64a110f0db516808aa24" gracePeriod=2 Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.040626 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-47zk5" Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.235092 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c4ds\" (UniqueName: \"kubernetes.io/projected/97346e65-09fb-415b-a61c-34bbdb536f56-kube-api-access-6c4ds\") pod \"97346e65-09fb-415b-a61c-34bbdb536f56\" (UID: \"97346e65-09fb-415b-a61c-34bbdb536f56\") " Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.235226 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97346e65-09fb-415b-a61c-34bbdb536f56-utilities\") pod \"97346e65-09fb-415b-a61c-34bbdb536f56\" (UID: \"97346e65-09fb-415b-a61c-34bbdb536f56\") " Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.235290 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97346e65-09fb-415b-a61c-34bbdb536f56-catalog-content\") pod \"97346e65-09fb-415b-a61c-34bbdb536f56\" (UID: \"97346e65-09fb-415b-a61c-34bbdb536f56\") " Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.236004 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97346e65-09fb-415b-a61c-34bbdb536f56-utilities" (OuterVolumeSpecName: "utilities") pod "97346e65-09fb-415b-a61c-34bbdb536f56" (UID: "97346e65-09fb-415b-a61c-34bbdb536f56"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.240124 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97346e65-09fb-415b-a61c-34bbdb536f56-kube-api-access-6c4ds" (OuterVolumeSpecName: "kube-api-access-6c4ds") pod "97346e65-09fb-415b-a61c-34bbdb536f56" (UID: "97346e65-09fb-415b-a61c-34bbdb536f56"). InnerVolumeSpecName "kube-api-access-6c4ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.294126 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97346e65-09fb-415b-a61c-34bbdb536f56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97346e65-09fb-415b-a61c-34bbdb536f56" (UID: "97346e65-09fb-415b-a61c-34bbdb536f56"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.341022 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c4ds\" (UniqueName: \"kubernetes.io/projected/97346e65-09fb-415b-a61c-34bbdb536f56-kube-api-access-6c4ds\") on node \"crc\" DevicePath \"\"" Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.341056 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97346e65-09fb-415b-a61c-34bbdb536f56-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.341067 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97346e65-09fb-415b-a61c-34bbdb536f56-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.593366 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"ef78e840-4520-4de1-8abe-af82e052bfa3","Type":"ContainerStarted","Data":"ead22e0f2fb226676e3d7cf7ddb50c8043dfe5ab297586bb993815e0cb38f873"} Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.595754 4681 generic.go:334] "Generic (PLEG): container finished" podID="97346e65-09fb-415b-a61c-34bbdb536f56" containerID="30ba904e4efba7510edb4c1d418c3d1b0ab87738c49f64a110f0db516808aa24" exitCode=0 Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.595803 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47zk5" event={"ID":"97346e65-09fb-415b-a61c-34bbdb536f56","Type":"ContainerDied","Data":"30ba904e4efba7510edb4c1d418c3d1b0ab87738c49f64a110f0db516808aa24"} Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.595835 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47zk5" event={"ID":"97346e65-09fb-415b-a61c-34bbdb536f56","Type":"ContainerDied","Data":"a42e25b3b8ae43ae24ed69d4491d34b6333becdddcb09f7793843f6da19f6619"} Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.595856 4681 scope.go:117] "RemoveContainer" containerID="30ba904e4efba7510edb4c1d418c3d1b0ab87738c49f64a110f0db516808aa24" Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.595882 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-47zk5" Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.627390 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-47zk5"] Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.636014 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-47zk5"] Jan 22 09:18:05 crc kubenswrapper[4681]: I0122 09:18:05.908722 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Jan 22 09:18:06 crc kubenswrapper[4681]: I0122 09:18:06.692316 4681 scope.go:117] "RemoveContainer" containerID="d12d4b00b4aa761fe952ef7fb56489f3b5ea1cef56815dc41fe95c4c10fb8f22" Jan 22 09:18:06 crc kubenswrapper[4681]: I0122 09:18:06.713745 4681 scope.go:117] "RemoveContainer" containerID="2daf0a2645c774d0080d58e0c282ed14dc3904eda43f78d58baf0fa094fe393a" Jan 22 09:18:06 crc kubenswrapper[4681]: I0122 09:18:06.741144 4681 scope.go:117] "RemoveContainer" containerID="30ba904e4efba7510edb4c1d418c3d1b0ab87738c49f64a110f0db516808aa24" Jan 22 09:18:06 crc kubenswrapper[4681]: E0122 09:18:06.741634 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30ba904e4efba7510edb4c1d418c3d1b0ab87738c49f64a110f0db516808aa24\": container with ID starting with 30ba904e4efba7510edb4c1d418c3d1b0ab87738c49f64a110f0db516808aa24 not found: ID does not exist" containerID="30ba904e4efba7510edb4c1d418c3d1b0ab87738c49f64a110f0db516808aa24" Jan 22 09:18:06 crc kubenswrapper[4681]: I0122 09:18:06.741686 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30ba904e4efba7510edb4c1d418c3d1b0ab87738c49f64a110f0db516808aa24"} err="failed to get container status \"30ba904e4efba7510edb4c1d418c3d1b0ab87738c49f64a110f0db516808aa24\": rpc error: code = NotFound desc = could not find container \"30ba904e4efba7510edb4c1d418c3d1b0ab87738c49f64a110f0db516808aa24\": container with ID starting with 30ba904e4efba7510edb4c1d418c3d1b0ab87738c49f64a110f0db516808aa24 not found: ID does not exist" Jan 22 09:18:06 crc kubenswrapper[4681]: I0122 09:18:06.741718 4681 scope.go:117] "RemoveContainer" containerID="d12d4b00b4aa761fe952ef7fb56489f3b5ea1cef56815dc41fe95c4c10fb8f22" Jan 22 09:18:06 crc kubenswrapper[4681]: E0122 09:18:06.742211 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d12d4b00b4aa761fe952ef7fb56489f3b5ea1cef56815dc41fe95c4c10fb8f22\": container with ID starting with d12d4b00b4aa761fe952ef7fb56489f3b5ea1cef56815dc41fe95c4c10fb8f22 not found: ID does not exist" containerID="d12d4b00b4aa761fe952ef7fb56489f3b5ea1cef56815dc41fe95c4c10fb8f22" Jan 22 09:18:06 crc kubenswrapper[4681]: I0122 09:18:06.742236 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d12d4b00b4aa761fe952ef7fb56489f3b5ea1cef56815dc41fe95c4c10fb8f22"} err="failed to get container status \"d12d4b00b4aa761fe952ef7fb56489f3b5ea1cef56815dc41fe95c4c10fb8f22\": rpc error: code = NotFound desc = could not find container \"d12d4b00b4aa761fe952ef7fb56489f3b5ea1cef56815dc41fe95c4c10fb8f22\": container with ID starting with d12d4b00b4aa761fe952ef7fb56489f3b5ea1cef56815dc41fe95c4c10fb8f22 not found: ID does not exist" Jan 22 09:18:06 crc kubenswrapper[4681]: I0122 09:18:06.742251 4681 scope.go:117] "RemoveContainer" 
containerID="2daf0a2645c774d0080d58e0c282ed14dc3904eda43f78d58baf0fa094fe393a" Jan 22 09:18:06 crc kubenswrapper[4681]: E0122 09:18:06.742563 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2daf0a2645c774d0080d58e0c282ed14dc3904eda43f78d58baf0fa094fe393a\": container with ID starting with 2daf0a2645c774d0080d58e0c282ed14dc3904eda43f78d58baf0fa094fe393a not found: ID does not exist" containerID="2daf0a2645c774d0080d58e0c282ed14dc3904eda43f78d58baf0fa094fe393a" Jan 22 09:18:06 crc kubenswrapper[4681]: I0122 09:18:06.742585 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2daf0a2645c774d0080d58e0c282ed14dc3904eda43f78d58baf0fa094fe393a"} err="failed to get container status \"2daf0a2645c774d0080d58e0c282ed14dc3904eda43f78d58baf0fa094fe393a\": rpc error: code = NotFound desc = could not find container \"2daf0a2645c774d0080d58e0c282ed14dc3904eda43f78d58baf0fa094fe393a\": container with ID starting with 2daf0a2645c774d0080d58e0c282ed14dc3904eda43f78d58baf0fa094fe393a not found: ID does not exist" Jan 22 09:18:07 crc kubenswrapper[4681]: I0122 09:18:07.460522 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97346e65-09fb-415b-a61c-34bbdb536f56" path="/var/lib/kubelet/pods/97346e65-09fb-415b-a61c-34bbdb536f56/volumes" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.168693 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8"] Jan 22 09:18:08 crc kubenswrapper[4681]: E0122 09:18:08.169176 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97346e65-09fb-415b-a61c-34bbdb536f56" containerName="registry-server" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.169202 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="97346e65-09fb-415b-a61c-34bbdb536f56" containerName="registry-server" Jan 22 09:18:08 crc kubenswrapper[4681]: E0122 09:18:08.169234 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97346e65-09fb-415b-a61c-34bbdb536f56" containerName="extract-utilities" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.169253 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="97346e65-09fb-415b-a61c-34bbdb536f56" containerName="extract-utilities" Jan 22 09:18:08 crc kubenswrapper[4681]: E0122 09:18:08.169327 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97346e65-09fb-415b-a61c-34bbdb536f56" containerName="extract-content" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.169346 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="97346e65-09fb-415b-a61c-34bbdb536f56" containerName="extract-content" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.169639 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="97346e65-09fb-415b-a61c-34bbdb536f56" containerName="registry-server" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.171160 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.174250 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.174598 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.182698 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8"] Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.281984 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa7f89fd-9692-40d1-85a5-54936ab44840-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8\" (UID: \"aa7f89fd-9692-40d1-85a5-54936ab44840\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.282052 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p78px\" (UniqueName: \"kubernetes.io/projected/aa7f89fd-9692-40d1-85a5-54936ab44840-kube-api-access-p78px\") pod \"default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8\" (UID: \"aa7f89fd-9692-40d1-85a5-54936ab44840\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.282200 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/aa7f89fd-9692-40d1-85a5-54936ab44840-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8\" (UID: \"aa7f89fd-9692-40d1-85a5-54936ab44840\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.282247 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/aa7f89fd-9692-40d1-85a5-54936ab44840-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8\" (UID: \"aa7f89fd-9692-40d1-85a5-54936ab44840\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.383839 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/aa7f89fd-9692-40d1-85a5-54936ab44840-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8\" (UID: \"aa7f89fd-9692-40d1-85a5-54936ab44840\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.383955 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/aa7f89fd-9692-40d1-85a5-54936ab44840-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8\" (UID: \"aa7f89fd-9692-40d1-85a5-54936ab44840\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.384007 4681 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa7f89fd-9692-40d1-85a5-54936ab44840-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8\" (UID: \"aa7f89fd-9692-40d1-85a5-54936ab44840\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.384063 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p78px\" (UniqueName: \"kubernetes.io/projected/aa7f89fd-9692-40d1-85a5-54936ab44840-kube-api-access-p78px\") pod \"default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8\" (UID: \"aa7f89fd-9692-40d1-85a5-54936ab44840\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.384726 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/aa7f89fd-9692-40d1-85a5-54936ab44840-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8\" (UID: \"aa7f89fd-9692-40d1-85a5-54936ab44840\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.384834 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa7f89fd-9692-40d1-85a5-54936ab44840-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8\" (UID: \"aa7f89fd-9692-40d1-85a5-54936ab44840\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.398888 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/aa7f89fd-9692-40d1-85a5-54936ab44840-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8\" (UID: \"aa7f89fd-9692-40d1-85a5-54936ab44840\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.416284 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p78px\" (UniqueName: \"kubernetes.io/projected/aa7f89fd-9692-40d1-85a5-54936ab44840-kube-api-access-p78px\") pod \"default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8\" (UID: \"aa7f89fd-9692-40d1-85a5-54936ab44840\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" Jan 22 09:18:08 crc kubenswrapper[4681]: I0122 09:18:08.491575 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" Jan 22 09:18:09 crc kubenswrapper[4681]: I0122 09:18:09.084813 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8"] Jan 22 09:18:09 crc kubenswrapper[4681]: I0122 09:18:09.729014 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" event={"ID":"aa7f89fd-9692-40d1-85a5-54936ab44840","Type":"ContainerStarted","Data":"7c8c4f022a2b0b308c99ab29b903e6694a236512590d6650a432b316375b0180"} Jan 22 09:18:11 crc kubenswrapper[4681]: I0122 09:18:11.674276 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b"] Jan 22 09:18:11 crc kubenswrapper[4681]: I0122 09:18:11.675588 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" Jan 22 09:18:11 crc kubenswrapper[4681]: I0122 09:18:11.679532 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap" Jan 22 09:18:11 crc kubenswrapper[4681]: I0122 09:18:11.700325 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b"] Jan 22 09:18:11 crc kubenswrapper[4681]: I0122 09:18:11.751993 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpsnm\" (UniqueName: \"kubernetes.io/projected/b152282d-9a3b-41c0-bb88-36de5b273303-kube-api-access-qpsnm\") pod \"default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b\" (UID: \"b152282d-9a3b-41c0-bb88-36de5b273303\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" Jan 22 09:18:11 crc kubenswrapper[4681]: I0122 09:18:11.752033 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/b152282d-9a3b-41c0-bb88-36de5b273303-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b\" (UID: \"b152282d-9a3b-41c0-bb88-36de5b273303\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" Jan 22 09:18:11 crc kubenswrapper[4681]: I0122 09:18:11.752061 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/b152282d-9a3b-41c0-bb88-36de5b273303-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b\" (UID: \"b152282d-9a3b-41c0-bb88-36de5b273303\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" Jan 22 09:18:11 crc kubenswrapper[4681]: I0122 09:18:11.752106 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/b152282d-9a3b-41c0-bb88-36de5b273303-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b\" (UID: \"b152282d-9a3b-41c0-bb88-36de5b273303\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" Jan 22 09:18:11 crc kubenswrapper[4681]: I0122 09:18:11.852870 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpsnm\" (UniqueName: 
\"kubernetes.io/projected/b152282d-9a3b-41c0-bb88-36de5b273303-kube-api-access-qpsnm\") pod \"default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b\" (UID: \"b152282d-9a3b-41c0-bb88-36de5b273303\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" Jan 22 09:18:11 crc kubenswrapper[4681]: I0122 09:18:11.852914 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/b152282d-9a3b-41c0-bb88-36de5b273303-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b\" (UID: \"b152282d-9a3b-41c0-bb88-36de5b273303\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" Jan 22 09:18:11 crc kubenswrapper[4681]: I0122 09:18:11.852945 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/b152282d-9a3b-41c0-bb88-36de5b273303-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b\" (UID: \"b152282d-9a3b-41c0-bb88-36de5b273303\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" Jan 22 09:18:11 crc kubenswrapper[4681]: I0122 09:18:11.852989 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/b152282d-9a3b-41c0-bb88-36de5b273303-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b\" (UID: \"b152282d-9a3b-41c0-bb88-36de5b273303\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" Jan 22 09:18:11 crc kubenswrapper[4681]: I0122 09:18:11.853618 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/b152282d-9a3b-41c0-bb88-36de5b273303-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b\" (UID: \"b152282d-9a3b-41c0-bb88-36de5b273303\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" Jan 22 09:18:11 crc kubenswrapper[4681]: I0122 09:18:11.853833 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/b152282d-9a3b-41c0-bb88-36de5b273303-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b\" (UID: \"b152282d-9a3b-41c0-bb88-36de5b273303\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" Jan 22 09:18:11 crc kubenswrapper[4681]: I0122 09:18:11.859033 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/b152282d-9a3b-41c0-bb88-36de5b273303-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b\" (UID: \"b152282d-9a3b-41c0-bb88-36de5b273303\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" Jan 22 09:18:11 crc kubenswrapper[4681]: I0122 09:18:11.868939 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpsnm\" (UniqueName: \"kubernetes.io/projected/b152282d-9a3b-41c0-bb88-36de5b273303-kube-api-access-qpsnm\") pod \"default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b\" (UID: \"b152282d-9a3b-41c0-bb88-36de5b273303\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" Jan 22 09:18:11 crc kubenswrapper[4681]: I0122 09:18:11.991422 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" Jan 22 09:18:15 crc kubenswrapper[4681]: I0122 09:18:15.907954 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Jan 22 09:18:15 crc kubenswrapper[4681]: I0122 09:18:15.970059 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" Jan 22 09:18:16 crc kubenswrapper[4681]: E0122 09:18:16.514446 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/sg-bridge:latest" Jan 22 09:18:16 crc kubenswrapper[4681]: E0122 09:18:16.514901 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:bridge,Image:quay.io/infrawatch/sg-bridge:latest,Command:[],Args:[--amqp_url amqp://default-interconnect.service-telemetry.svc.cluster.local:5673/anycast/ceilometer/cloud1-metering.sample --block --stat_period 60 --rbc 15000 --count 0 --gw_unix /tmp/smartgateway --rbs 16384],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tgnk2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn_service-telemetry(447b910c-7755-4492-87ef-fa6a55ee9698): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 09:18:16 crc kubenswrapper[4681]: I0122 09:18:16.533339 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b"] Jan 22 09:18:16 crc kubenswrapper[4681]: W0122 09:18:16.552301 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb152282d_9a3b_41c0_bb88_36de5b273303.slice/crio-fa6cefb198f7b8cf018ba6f3460495bd528269e31b882cbca40c8ac9b9c73510 WatchSource:0}: Error finding container fa6cefb198f7b8cf018ba6f3460495bd528269e31b882cbca40c8ac9b9c73510: Status 404 returned error can't find the container with id fa6cefb198f7b8cf018ba6f3460495bd528269e31b882cbca40c8ac9b9c73510 Jan 22 09:18:16 crc kubenswrapper[4681]: E0122 09:18:16.566003 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/sg-bridge:latest" Jan 22 09:18:16 crc kubenswrapper[4681]: E0122 09:18:16.566188 4681 kuberuntime_manager.go:1274] "Unhandled 
Error" err="container &Container{Name:bridge,Image:quay.io/infrawatch/sg-bridge:latest,Command:[],Args:[--amqp_url amqp://default-interconnect.service-telemetry.svc.cluster.local:5673/collectd/cloud1-telemetry --block --stat_period 60 --rbc 15000 --count 0 --gw_unix /tmp/smartgateway --rbs 16384],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-54cxf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr_service-telemetry(f38c0f28-9c07-459b-89e4-0aa2c7847262): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 09:18:16 crc kubenswrapper[4681]: I0122 09:18:16.851602 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" event={"ID":"b152282d-9a3b-41c0-bb88-36de5b273303","Type":"ContainerStarted","Data":"fa6cefb198f7b8cf018ba6f3460495bd528269e31b882cbca40c8ac9b9c73510"} Jan 22 09:18:16 crc kubenswrapper[4681]: I0122 09:18:16.853316 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"ef78e840-4520-4de1-8abe-af82e052bfa3","Type":"ContainerStarted","Data":"a994516a1117496f017ecccb99ce264cec0965b822ae9fef4fe3fef0d2749f63"} Jan 22 09:18:16 crc kubenswrapper[4681]: I0122 09:18:16.855314 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" event={"ID":"f4d025fe-ae0e-48a3-bdba-35983685d558","Type":"ContainerStarted","Data":"9500d4799a82439bc4fe156270432417ebcd60b5aefcc090ba429216ccbae483"} Jan 22 09:18:16 crc kubenswrapper[4681]: I0122 09:18:16.856733 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" event={"ID":"aa7f89fd-9692-40d1-85a5-54936ab44840","Type":"ContainerStarted","Data":"0c3c6e344cc04676794297a5187c2e0b537bf827dc22a5eca96fb9d25f70bc2b"} Jan 22 09:18:16 crc kubenswrapper[4681]: I0122 09:18:16.918593 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Jan 22 09:18:16 crc kubenswrapper[4681]: I0122 09:18:16.946323 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=16.015028156 podStartE2EDuration="35.946305506s" podCreationTimestamp="2026-01-22 09:17:41 +0000 UTC" firstStartedPulling="2026-01-22 09:17:56.517153342 +0000 
UTC m=+867.343063847" lastFinishedPulling="2026-01-22 09:18:16.448430692 +0000 UTC m=+887.274341197" observedRunningTime="2026-01-22 09:18:16.886550971 +0000 UTC m=+887.712461476" watchObservedRunningTime="2026-01-22 09:18:16.946305506 +0000 UTC m=+887.772216031" Jan 22 09:18:17 crc kubenswrapper[4681]: I0122 09:18:17.866189 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" event={"ID":"b152282d-9a3b-41c0-bb88-36de5b273303","Type":"ContainerStarted","Data":"6024b6a4d034b554adeb86d4e03e90783a11784c706fa5ef522ef63aee87cf15"} Jan 22 09:18:22 crc kubenswrapper[4681]: I0122 09:18:22.870991 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-r5tvj"] Jan 22 09:18:22 crc kubenswrapper[4681]: I0122 09:18:22.871715 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" podUID="0f9dbef0-752d-4587-ae43-d5b405a24a7d" containerName="default-interconnect" containerID="cri-o://d099710a840bc24a224e2e6ef77decf9772067ea6144a503c25acecdfe7ebb4b" gracePeriod=30 Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.249218 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.263101 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-inter-router-ca\") pod \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.263144 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbthx\" (UniqueName: \"kubernetes.io/projected/0f9dbef0-752d-4587-ae43-d5b405a24a7d-kube-api-access-fbthx\") pod \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.267702 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-openstack-credentials\") pod \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.267814 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-inter-router-credentials\") pod \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.267875 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-openstack-ca\") pod \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.267905 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: 
\"kubernetes.io/configmap/0f9dbef0-752d-4587-ae43-d5b405a24a7d-sasl-config\") pod \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.267934 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-sasl-users\") pod \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\" (UID: \"0f9dbef0-752d-4587-ae43-d5b405a24a7d\") " Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.269246 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f9dbef0-752d-4587-ae43-d5b405a24a7d-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "0f9dbef0-752d-4587-ae43-d5b405a24a7d" (UID: "0f9dbef0-752d-4587-ae43-d5b405a24a7d"). InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.275594 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "0f9dbef0-752d-4587-ae43-d5b405a24a7d" (UID: "0f9dbef0-752d-4587-ae43-d5b405a24a7d"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.281458 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "0f9dbef0-752d-4587-ae43-d5b405a24a7d" (UID: "0f9dbef0-752d-4587-ae43-d5b405a24a7d"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.281579 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "0f9dbef0-752d-4587-ae43-d5b405a24a7d" (UID: "0f9dbef0-752d-4587-ae43-d5b405a24a7d"). InnerVolumeSpecName "default-interconnect-openstack-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.282220 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "0f9dbef0-752d-4587-ae43-d5b405a24a7d" (UID: "0f9dbef0-752d-4587-ae43-d5b405a24a7d"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.285484 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "0f9dbef0-752d-4587-ae43-d5b405a24a7d" (UID: "0f9dbef0-752d-4587-ae43-d5b405a24a7d"). InnerVolumeSpecName "default-interconnect-openstack-credentials". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.302977 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f9dbef0-752d-4587-ae43-d5b405a24a7d-kube-api-access-fbthx" (OuterVolumeSpecName: "kube-api-access-fbthx") pod "0f9dbef0-752d-4587-ae43-d5b405a24a7d" (UID: "0f9dbef0-752d-4587-ae43-d5b405a24a7d"). InnerVolumeSpecName "kube-api-access-fbthx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.369616 4681 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.369658 4681 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\"" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.369673 4681 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.369689 4681 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/0f9dbef0-752d-4587-ae43-d5b405a24a7d-sasl-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.369698 4681 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-sasl-users\") on node \"crc\" DevicePath \"\"" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.369706 4681 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/0f9dbef0-752d-4587-ae43-d5b405a24a7d-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.369716 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbthx\" (UniqueName: \"kubernetes.io/projected/0f9dbef0-752d-4587-ae43-d5b405a24a7d-kube-api-access-fbthx\") on node \"crc\" DevicePath \"\"" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.911801 4681 generic.go:334] "Generic (PLEG): container finished" podID="aa7f89fd-9692-40d1-85a5-54936ab44840" containerID="0c3c6e344cc04676794297a5187c2e0b537bf827dc22a5eca96fb9d25f70bc2b" exitCode=0 Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.911897 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" event={"ID":"aa7f89fd-9692-40d1-85a5-54936ab44840","Type":"ContainerDied","Data":"0c3c6e344cc04676794297a5187c2e0b537bf827dc22a5eca96fb9d25f70bc2b"} Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.914301 4681 generic.go:334] "Generic (PLEG): container finished" podID="b152282d-9a3b-41c0-bb88-36de5b273303" containerID="6024b6a4d034b554adeb86d4e03e90783a11784c706fa5ef522ef63aee87cf15" exitCode=0 Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.914382 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" event={"ID":"b152282d-9a3b-41c0-bb88-36de5b273303","Type":"ContainerDied","Data":"6024b6a4d034b554adeb86d4e03e90783a11784c706fa5ef522ef63aee87cf15"} Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.916284 4681 generic.go:334] "Generic (PLEG): container finished" podID="f4d025fe-ae0e-48a3-bdba-35983685d558" containerID="9500d4799a82439bc4fe156270432417ebcd60b5aefcc090ba429216ccbae483" exitCode=0 Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.916334 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" event={"ID":"f4d025fe-ae0e-48a3-bdba-35983685d558","Type":"ContainerDied","Data":"9500d4799a82439bc4fe156270432417ebcd60b5aefcc090ba429216ccbae483"} Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.918615 4681 generic.go:334] "Generic (PLEG): container finished" podID="0f9dbef0-752d-4587-ae43-d5b405a24a7d" containerID="d099710a840bc24a224e2e6ef77decf9772067ea6144a503c25acecdfe7ebb4b" exitCode=0 Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.918643 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" event={"ID":"0f9dbef0-752d-4587-ae43-d5b405a24a7d","Type":"ContainerDied","Data":"d099710a840bc24a224e2e6ef77decf9772067ea6144a503c25acecdfe7ebb4b"} Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.918658 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" event={"ID":"0f9dbef0-752d-4587-ae43-d5b405a24a7d","Type":"ContainerDied","Data":"5e749288731eb69c1508916b718024efa5e213537c6ff70f0936d27968793352"} Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.918673 4681 scope.go:117] "RemoveContainer" containerID="d099710a840bc24a224e2e6ef77decf9772067ea6144a503c25acecdfe7ebb4b" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.918774 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-r5tvj" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.942411 4681 scope.go:117] "RemoveContainer" containerID="d099710a840bc24a224e2e6ef77decf9772067ea6144a503c25acecdfe7ebb4b" Jan 22 09:18:23 crc kubenswrapper[4681]: E0122 09:18:23.942791 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d099710a840bc24a224e2e6ef77decf9772067ea6144a503c25acecdfe7ebb4b\": container with ID starting with d099710a840bc24a224e2e6ef77decf9772067ea6144a503c25acecdfe7ebb4b not found: ID does not exist" containerID="d099710a840bc24a224e2e6ef77decf9772067ea6144a503c25acecdfe7ebb4b" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.942839 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d099710a840bc24a224e2e6ef77decf9772067ea6144a503c25acecdfe7ebb4b"} err="failed to get container status \"d099710a840bc24a224e2e6ef77decf9772067ea6144a503c25acecdfe7ebb4b\": rpc error: code = NotFound desc = could not find container \"d099710a840bc24a224e2e6ef77decf9772067ea6144a503c25acecdfe7ebb4b\": container with ID starting with d099710a840bc24a224e2e6ef77decf9772067ea6144a503c25acecdfe7ebb4b not found: ID does not exist" Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.943677 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-r5tvj"] Jan 22 09:18:23 crc kubenswrapper[4681]: I0122 09:18:23.947779 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-r5tvj"] Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.608740 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-rsshc"] Jan 22 09:18:24 crc kubenswrapper[4681]: E0122 09:18:24.609293 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f9dbef0-752d-4587-ae43-d5b405a24a7d" containerName="default-interconnect" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.609310 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f9dbef0-752d-4587-ae43-d5b405a24a7d" containerName="default-interconnect" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.609458 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f9dbef0-752d-4587-ae43-d5b405a24a7d" containerName="default-interconnect" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.609901 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.614147 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.614166 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-qhl2k" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.614430 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.614787 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.615038 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.615599 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.616493 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.632487 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-rsshc"] Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.697483 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/76112dc3-3650-4767-ae53-7f11a6baac67-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.697564 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/76112dc3-3650-4767-ae53-7f11a6baac67-sasl-config\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.697589 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/76112dc3-3650-4767-ae53-7f11a6baac67-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.697604 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnp92\" (UniqueName: \"kubernetes.io/projected/76112dc3-3650-4767-ae53-7f11a6baac67-kube-api-access-jnp92\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.697630 4681 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/76112dc3-3650-4767-ae53-7f11a6baac67-sasl-users\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.697653 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/76112dc3-3650-4767-ae53-7f11a6baac67-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.697702 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/76112dc3-3650-4767-ae53-7f11a6baac67-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.799136 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/76112dc3-3650-4767-ae53-7f11a6baac67-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.799182 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/76112dc3-3650-4767-ae53-7f11a6baac67-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.799232 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/76112dc3-3650-4767-ae53-7f11a6baac67-sasl-config\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.799260 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/76112dc3-3650-4767-ae53-7f11a6baac67-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.799320 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnp92\" (UniqueName: \"kubernetes.io/projected/76112dc3-3650-4767-ae53-7f11a6baac67-kube-api-access-jnp92\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: 
I0122 09:18:24.799344 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/76112dc3-3650-4767-ae53-7f11a6baac67-sasl-users\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.799370 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/76112dc3-3650-4767-ae53-7f11a6baac67-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.800680 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/76112dc3-3650-4767-ae53-7f11a6baac67-sasl-config\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.803291 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/76112dc3-3650-4767-ae53-7f11a6baac67-sasl-users\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.803643 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/76112dc3-3650-4767-ae53-7f11a6baac67-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.804125 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/76112dc3-3650-4767-ae53-7f11a6baac67-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.812668 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/76112dc3-3650-4767-ae53-7f11a6baac67-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.817571 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnp92\" (UniqueName: \"kubernetes.io/projected/76112dc3-3650-4767-ae53-7f11a6baac67-kube-api-access-jnp92\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.818472 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/76112dc3-3650-4767-ae53-7f11a6baac67-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-rsshc\" (UID: \"76112dc3-3650-4767-ae53-7f11a6baac67\") " pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:24 crc kubenswrapper[4681]: I0122 09:18:24.926233 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-rsshc" Jan 22 09:18:25 crc kubenswrapper[4681]: I0122 09:18:25.438625 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-rsshc"] Jan 22 09:18:25 crc kubenswrapper[4681]: I0122 09:18:25.478189 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f9dbef0-752d-4587-ae43-d5b405a24a7d" path="/var/lib/kubelet/pods/0f9dbef0-752d-4587-ae43-d5b405a24a7d/volumes" Jan 22 09:18:25 crc kubenswrapper[4681]: I0122 09:18:25.936442 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-rsshc" event={"ID":"76112dc3-3650-4767-ae53-7f11a6baac67","Type":"ContainerStarted","Data":"63fd62f4e54b5297377b27e941b6411b8d67f941d6756d6c8b63adba91068a70"} Jan 22 09:18:25 crc kubenswrapper[4681]: I0122 09:18:25.936483 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-rsshc" event={"ID":"76112dc3-3650-4767-ae53-7f11a6baac67","Type":"ContainerStarted","Data":"f906bad74acf1f05fda9d6d9cb90aba4498e68bebe2d1d6a6f3d6876fc46761e"} Jan 22 09:18:25 crc kubenswrapper[4681]: I0122 09:18:25.956981 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-rsshc" podStartSLOduration=3.956951471 podStartE2EDuration="3.956951471s" podCreationTimestamp="2026-01-22 09:18:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:18:25.956101289 +0000 UTC m=+896.782011804" watchObservedRunningTime="2026-01-22 09:18:25.956951471 +0000 UTC m=+896.782861976" Jan 22 09:18:26 crc kubenswrapper[4681]: I0122 09:18:26.030892 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:18:26 crc kubenswrapper[4681]: I0122 09:18:26.031213 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:18:29 crc kubenswrapper[4681]: E0122 09:18:29.690842 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" podUID="447b910c-7755-4492-87ef-fa6a55ee9698" Jan 22 09:18:29 crc kubenswrapper[4681]: E0122 09:18:29.691941 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying config: context canceled\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" podUID="f38c0f28-9c07-459b-89e4-0aa2c7847262" Jan 22 09:18:29 crc kubenswrapper[4681]: I0122 09:18:29.972276 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" event={"ID":"b152282d-9a3b-41c0-bb88-36de5b273303","Type":"ContainerStarted","Data":"0a878bc7abfb19a22a894d26a8e74c2b747113265a3edde69b19e06973153bfb"} Jan 22 09:18:29 crc kubenswrapper[4681]: I0122 09:18:29.972724 4681 scope.go:117] "RemoveContainer" containerID="6024b6a4d034b554adeb86d4e03e90783a11784c706fa5ef522ef63aee87cf15" Jan 22 09:18:29 crc kubenswrapper[4681]: I0122 09:18:29.974150 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" event={"ID":"447b910c-7755-4492-87ef-fa6a55ee9698","Type":"ContainerStarted","Data":"c91570efe39e5b715fb5c85f103359d7bb62d31d2bd4a41f5f38ddf86bbf1e52"} Jan 22 09:18:29 crc kubenswrapper[4681]: I0122 09:18:29.977068 4681 scope.go:117] "RemoveContainer" containerID="9500d4799a82439bc4fe156270432417ebcd60b5aefcc090ba429216ccbae483" Jan 22 09:18:29 crc kubenswrapper[4681]: I0122 09:18:29.977071 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" event={"ID":"f4d025fe-ae0e-48a3-bdba-35983685d558","Type":"ContainerStarted","Data":"ec5a576121fee250116806f2052b82da13a68290cd8202ddd9bf0be27dbe55ed"} Jan 22 09:18:29 crc kubenswrapper[4681]: I0122 09:18:29.979750 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" event={"ID":"aa7f89fd-9692-40d1-85a5-54936ab44840","Type":"ContainerStarted","Data":"bcda5af12a5cdf85e604bdbfd8be6d2ca8f7089106c3cb398ee8cff18a1e1077"} Jan 22 09:18:29 crc kubenswrapper[4681]: I0122 09:18:29.980103 4681 scope.go:117] "RemoveContainer" containerID="0c3c6e344cc04676794297a5187c2e0b537bf827dc22a5eca96fb9d25f70bc2b" Jan 22 09:18:29 crc kubenswrapper[4681]: I0122 09:18:29.983392 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" event={"ID":"f38c0f28-9c07-459b-89e4-0aa2c7847262","Type":"ContainerStarted","Data":"ab7d3454baf49e2f8fe1926e0949abe3b752ab1ad0063aefdb516673d575206e"} Jan 22 09:18:30 crc kubenswrapper[4681]: I0122 09:18:30.999631 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" event={"ID":"f4d025fe-ae0e-48a3-bdba-35983685d558","Type":"ContainerStarted","Data":"2c054cd64e926ed850f97d5b63349070c2b23373e0965d9f1360c9b844cd3012"} Jan 22 09:18:31 crc kubenswrapper[4681]: I0122 09:18:31.004802 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" event={"ID":"aa7f89fd-9692-40d1-85a5-54936ab44840","Type":"ContainerStarted","Data":"c50a7e907b405266938eec812e48aa8589df1b87e3333af0673c58b455ce5fd6"} Jan 22 09:18:31 crc kubenswrapper[4681]: I0122 09:18:31.007011 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" event={"ID":"f38c0f28-9c07-459b-89e4-0aa2c7847262","Type":"ContainerStarted","Data":"90501252bc51d3d3fd938630c3bee09b5d7aa0856fd17a042a8db249984ab7ab"} Jan 22 09:18:31 crc kubenswrapper[4681]: 
I0122 09:18:31.009392 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" event={"ID":"b152282d-9a3b-41c0-bb88-36de5b273303","Type":"ContainerStarted","Data":"9fc40e2cce48caa3a33b824df93b4bdb6342db69b4cca16a4d5942eb0cf6a77f"} Jan 22 09:18:31 crc kubenswrapper[4681]: I0122 09:18:31.011210 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" event={"ID":"447b910c-7755-4492-87ef-fa6a55ee9698","Type":"ContainerStarted","Data":"f1ddc95eabdf644cf70bb288f61b12cb1d69c643d41d33bf41ce18de528e9115"} Jan 22 09:18:31 crc kubenswrapper[4681]: I0122 09:18:31.022007 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x" podStartSLOduration=3.108989839 podStartE2EDuration="30.021989524s" podCreationTimestamp="2026-01-22 09:18:01 +0000 UTC" firstStartedPulling="2026-01-22 09:18:03.493066313 +0000 UTC m=+874.318976818" lastFinishedPulling="2026-01-22 09:18:30.406065998 +0000 UTC m=+901.231976503" observedRunningTime="2026-01-22 09:18:31.017597237 +0000 UTC m=+901.843507772" watchObservedRunningTime="2026-01-22 09:18:31.021989524 +0000 UTC m=+901.847900039" Jan 22 09:18:31 crc kubenswrapper[4681]: I0122 09:18:31.040739 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b" podStartSLOduration=6.156355257 podStartE2EDuration="20.04071585s" podCreationTimestamp="2026-01-22 09:18:11 +0000 UTC" firstStartedPulling="2026-01-22 09:18:16.585420425 +0000 UTC m=+887.411330930" lastFinishedPulling="2026-01-22 09:18:30.469781018 +0000 UTC m=+901.295691523" observedRunningTime="2026-01-22 09:18:31.034721591 +0000 UTC m=+901.860632136" watchObservedRunningTime="2026-01-22 09:18:31.04071585 +0000 UTC m=+901.866626385" Jan 22 09:18:31 crc kubenswrapper[4681]: I0122 09:18:31.063416 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn" podStartSLOduration=4.443675492 podStartE2EDuration="33.063400412s" podCreationTimestamp="2026-01-22 09:17:58 +0000 UTC" firstStartedPulling="2026-01-22 09:18:01.798760358 +0000 UTC m=+872.624670863" lastFinishedPulling="2026-01-22 09:18:30.418485278 +0000 UTC m=+901.244395783" observedRunningTime="2026-01-22 09:18:31.057985498 +0000 UTC m=+901.883896013" watchObservedRunningTime="2026-01-22 09:18:31.063400412 +0000 UTC m=+901.889310927" Jan 22 09:18:31 crc kubenswrapper[4681]: I0122 09:18:31.100228 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr" podStartSLOduration=7.366731992 podStartE2EDuration="36.100139366s" podCreationTimestamp="2026-01-22 09:17:55 +0000 UTC" firstStartedPulling="2026-01-22 09:18:01.754542436 +0000 UTC m=+872.580452941" lastFinishedPulling="2026-01-22 09:18:30.48794981 +0000 UTC m=+901.313860315" observedRunningTime="2026-01-22 09:18:31.091065166 +0000 UTC m=+901.916975711" watchObservedRunningTime="2026-01-22 09:18:31.100139366 +0000 UTC m=+901.926049901" Jan 22 09:18:31 crc kubenswrapper[4681]: I0122 09:18:31.114922 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8" podStartSLOduration=1.680074083 
podStartE2EDuration="23.114905118s" podCreationTimestamp="2026-01-22 09:18:08 +0000 UTC" firstStartedPulling="2026-01-22 09:18:09.088897574 +0000 UTC m=+879.914808079" lastFinishedPulling="2026-01-22 09:18:30.523728609 +0000 UTC m=+901.349639114" observedRunningTime="2026-01-22 09:18:31.110227374 +0000 UTC m=+901.936137919" watchObservedRunningTime="2026-01-22 09:18:31.114905118 +0000 UTC m=+901.940815633" Jan 22 09:18:40 crc kubenswrapper[4681]: I0122 09:18:40.485339 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Jan 22 09:18:40 crc kubenswrapper[4681]: I0122 09:18:40.486513 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Jan 22 09:18:40 crc kubenswrapper[4681]: I0122 09:18:40.491281 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Jan 22 09:18:40 crc kubenswrapper[4681]: I0122 09:18:40.491556 4681 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Jan 22 09:18:40 crc kubenswrapper[4681]: I0122 09:18:40.500656 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Jan 22 09:18:40 crc kubenswrapper[4681]: I0122 09:18:40.568960 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/7790a75e-2dfd-4f34-8327-ca553ec25c44-qdr-test-config\") pod \"qdr-test\" (UID: \"7790a75e-2dfd-4f34-8327-ca553ec25c44\") " pod="service-telemetry/qdr-test" Jan 22 09:18:40 crc kubenswrapper[4681]: I0122 09:18:40.569096 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/7790a75e-2dfd-4f34-8327-ca553ec25c44-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"7790a75e-2dfd-4f34-8327-ca553ec25c44\") " pod="service-telemetry/qdr-test" Jan 22 09:18:40 crc kubenswrapper[4681]: I0122 09:18:40.569166 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnwcs\" (UniqueName: \"kubernetes.io/projected/7790a75e-2dfd-4f34-8327-ca553ec25c44-kube-api-access-dnwcs\") pod \"qdr-test\" (UID: \"7790a75e-2dfd-4f34-8327-ca553ec25c44\") " pod="service-telemetry/qdr-test" Jan 22 09:18:40 crc kubenswrapper[4681]: I0122 09:18:40.670903 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/7790a75e-2dfd-4f34-8327-ca553ec25c44-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"7790a75e-2dfd-4f34-8327-ca553ec25c44\") " pod="service-telemetry/qdr-test" Jan 22 09:18:40 crc kubenswrapper[4681]: I0122 09:18:40.671009 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnwcs\" (UniqueName: \"kubernetes.io/projected/7790a75e-2dfd-4f34-8327-ca553ec25c44-kube-api-access-dnwcs\") pod \"qdr-test\" (UID: \"7790a75e-2dfd-4f34-8327-ca553ec25c44\") " pod="service-telemetry/qdr-test" Jan 22 09:18:40 crc kubenswrapper[4681]: I0122 09:18:40.671092 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/7790a75e-2dfd-4f34-8327-ca553ec25c44-qdr-test-config\") pod \"qdr-test\" (UID: \"7790a75e-2dfd-4f34-8327-ca553ec25c44\") " pod="service-telemetry/qdr-test" Jan 
22 09:18:40 crc kubenswrapper[4681]: I0122 09:18:40.671816 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/7790a75e-2dfd-4f34-8327-ca553ec25c44-qdr-test-config\") pod \"qdr-test\" (UID: \"7790a75e-2dfd-4f34-8327-ca553ec25c44\") " pod="service-telemetry/qdr-test" Jan 22 09:18:40 crc kubenswrapper[4681]: I0122 09:18:40.677077 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/7790a75e-2dfd-4f34-8327-ca553ec25c44-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"7790a75e-2dfd-4f34-8327-ca553ec25c44\") " pod="service-telemetry/qdr-test" Jan 22 09:18:40 crc kubenswrapper[4681]: I0122 09:18:40.688773 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnwcs\" (UniqueName: \"kubernetes.io/projected/7790a75e-2dfd-4f34-8327-ca553ec25c44-kube-api-access-dnwcs\") pod \"qdr-test\" (UID: \"7790a75e-2dfd-4f34-8327-ca553ec25c44\") " pod="service-telemetry/qdr-test" Jan 22 09:18:40 crc kubenswrapper[4681]: I0122 09:18:40.803382 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Jan 22 09:18:41 crc kubenswrapper[4681]: I0122 09:18:41.241804 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Jan 22 09:18:41 crc kubenswrapper[4681]: W0122 09:18:41.247488 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7790a75e_2dfd_4f34_8327_ca553ec25c44.slice/crio-5fb5c2a55244db43ad84fc8d94e7f3339c74c252876386d3ebd052589622c0d4 WatchSource:0}: Error finding container 5fb5c2a55244db43ad84fc8d94e7f3339c74c252876386d3ebd052589622c0d4: Status 404 returned error can't find the container with id 5fb5c2a55244db43ad84fc8d94e7f3339c74c252876386d3ebd052589622c0d4 Jan 22 09:18:42 crc kubenswrapper[4681]: I0122 09:18:42.090224 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"7790a75e-2dfd-4f34-8327-ca553ec25c44","Type":"ContainerStarted","Data":"5fb5c2a55244db43ad84fc8d94e7f3339c74c252876386d3ebd052589622c0d4"} Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.162380 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"7790a75e-2dfd-4f34-8327-ca553ec25c44","Type":"ContainerStarted","Data":"1bf035ba9c77350acca5d5ec5190a57fcbd4311657ba2b22dc52f3fccd6df21c"} Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.202759 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=1.9365044729999998 podStartE2EDuration="9.202732774s" podCreationTimestamp="2026-01-22 09:18:40 +0000 UTC" firstStartedPulling="2026-01-22 09:18:41.250836477 +0000 UTC m=+912.076746982" lastFinishedPulling="2026-01-22 09:18:48.517064768 +0000 UTC m=+919.342975283" observedRunningTime="2026-01-22 09:18:49.197246039 +0000 UTC m=+920.023156544" watchObservedRunningTime="2026-01-22 09:18:49.202732774 +0000 UTC m=+920.028643289" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.503221 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-8tmfj"] Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.504161 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.506730 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.506904 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.507256 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.509787 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.509801 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.510109 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.512169 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-ceilometer-publisher\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.512245 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-collectd-config\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.512315 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-healthcheck-log\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.512392 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-sensubility-config\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.512427 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8mqn\" (UniqueName: \"kubernetes.io/projected/042bf4f4-849a-41a1-8bc4-650995af5d74-kube-api-access-f8mqn\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.512558 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: 
\"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.512618 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.519519 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-8tmfj"] Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.613980 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-collectd-config\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.614038 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-healthcheck-log\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.614078 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-sensubility-config\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.614097 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8mqn\" (UniqueName: \"kubernetes.io/projected/042bf4f4-849a-41a1-8bc4-650995af5d74-kube-api-access-f8mqn\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.614137 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.614173 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.614209 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-ceilometer-publisher\") pod 
\"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.615378 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-ceilometer-publisher\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.615992 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-collectd-config\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.616561 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-healthcheck-log\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.617113 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-sensubility-config\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.618129 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.618893 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.642380 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8mqn\" (UniqueName: \"kubernetes.io/projected/042bf4f4-849a-41a1-8bc4-650995af5d74-kube-api-access-f8mqn\") pod \"stf-smoketest-smoke1-8tmfj\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.824582 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.917997 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.919733 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Jan 22 09:18:49 crc kubenswrapper[4681]: I0122 09:18:49.936909 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Jan 22 09:18:50 crc kubenswrapper[4681]: I0122 09:18:50.027080 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84s74\" (UniqueName: \"kubernetes.io/projected/420a963d-c484-47e8-8f00-06ecb15e323e-kube-api-access-84s74\") pod \"curl\" (UID: \"420a963d-c484-47e8-8f00-06ecb15e323e\") " pod="service-telemetry/curl" Jan 22 09:18:50 crc kubenswrapper[4681]: I0122 09:18:50.128514 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84s74\" (UniqueName: \"kubernetes.io/projected/420a963d-c484-47e8-8f00-06ecb15e323e-kube-api-access-84s74\") pod \"curl\" (UID: \"420a963d-c484-47e8-8f00-06ecb15e323e\") " pod="service-telemetry/curl" Jan 22 09:18:50 crc kubenswrapper[4681]: I0122 09:18:50.147761 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84s74\" (UniqueName: \"kubernetes.io/projected/420a963d-c484-47e8-8f00-06ecb15e323e-kube-api-access-84s74\") pod \"curl\" (UID: \"420a963d-c484-47e8-8f00-06ecb15e323e\") " pod="service-telemetry/curl" Jan 22 09:18:50 crc kubenswrapper[4681]: I0122 09:18:50.263642 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Jan 22 09:18:50 crc kubenswrapper[4681]: I0122 09:18:50.387208 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-8tmfj"] Jan 22 09:18:50 crc kubenswrapper[4681]: W0122 09:18:50.400492 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod042bf4f4_849a_41a1_8bc4_650995af5d74.slice/crio-f546399850b41e545481025deed4d1f372e17978324691c715b6db84eb61a428 WatchSource:0}: Error finding container f546399850b41e545481025deed4d1f372e17978324691c715b6db84eb61a428: Status 404 returned error can't find the container with id f546399850b41e545481025deed4d1f372e17978324691c715b6db84eb61a428 Jan 22 09:18:50 crc kubenswrapper[4681]: I0122 09:18:50.493491 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Jan 22 09:18:50 crc kubenswrapper[4681]: W0122 09:18:50.501000 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod420a963d_c484_47e8_8f00_06ecb15e323e.slice/crio-65369348cf3edde5612a11484e4d75a54d73c4c0b3d477ef20f82356fad946ee WatchSource:0}: Error finding container 65369348cf3edde5612a11484e4d75a54d73c4c0b3d477ef20f82356fad946ee: Status 404 returned error can't find the container with id 65369348cf3edde5612a11484e4d75a54d73c4c0b3d477ef20f82356fad946ee Jan 22 09:18:51 crc kubenswrapper[4681]: I0122 09:18:51.176586 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"420a963d-c484-47e8-8f00-06ecb15e323e","Type":"ContainerStarted","Data":"65369348cf3edde5612a11484e4d75a54d73c4c0b3d477ef20f82356fad946ee"} Jan 22 09:18:51 crc kubenswrapper[4681]: I0122 09:18:51.178786 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-8tmfj" event={"ID":"042bf4f4-849a-41a1-8bc4-650995af5d74","Type":"ContainerStarted","Data":"f546399850b41e545481025deed4d1f372e17978324691c715b6db84eb61a428"} Jan 22 09:18:53 crc kubenswrapper[4681]: I0122 09:18:53.195187 
4681 generic.go:334] "Generic (PLEG): container finished" podID="420a963d-c484-47e8-8f00-06ecb15e323e" containerID="006542a74c0dabe93760c70333d5e97cf41ad72a6d6b1a2f6dc8856c3875cd3e" exitCode=0 Jan 22 09:18:53 crc kubenswrapper[4681]: I0122 09:18:53.195513 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"420a963d-c484-47e8-8f00-06ecb15e323e","Type":"ContainerDied","Data":"006542a74c0dabe93760c70333d5e97cf41ad72a6d6b1a2f6dc8856c3875cd3e"} Jan 22 09:18:56 crc kubenswrapper[4681]: I0122 09:18:56.034585 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:18:56 crc kubenswrapper[4681]: I0122 09:18:56.035169 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:18:56 crc kubenswrapper[4681]: I0122 09:18:56.036308 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:18:56 crc kubenswrapper[4681]: I0122 09:18:56.220459 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2e24b305b35b3a09fbd338e564438cca6ba09e567d5c64f883d53c47948b3c4"} pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 09:18:56 crc kubenswrapper[4681]: I0122 09:18:56.220570 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" containerID="cri-o://f2e24b305b35b3a09fbd338e564438cca6ba09e567d5c64f883d53c47948b3c4" gracePeriod=600 Jan 22 09:19:00 crc kubenswrapper[4681]: I0122 09:19:00.725103 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Jan 22 09:19:00 crc kubenswrapper[4681]: I0122 09:19:00.796279 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84s74\" (UniqueName: \"kubernetes.io/projected/420a963d-c484-47e8-8f00-06ecb15e323e-kube-api-access-84s74\") pod \"420a963d-c484-47e8-8f00-06ecb15e323e\" (UID: \"420a963d-c484-47e8-8f00-06ecb15e323e\") " Jan 22 09:19:00 crc kubenswrapper[4681]: I0122 09:19:00.820935 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/420a963d-c484-47e8-8f00-06ecb15e323e-kube-api-access-84s74" (OuterVolumeSpecName: "kube-api-access-84s74") pod "420a963d-c484-47e8-8f00-06ecb15e323e" (UID: "420a963d-c484-47e8-8f00-06ecb15e323e"). InnerVolumeSpecName "kube-api-access-84s74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:19:00 crc kubenswrapper[4681]: I0122 09:19:00.871201 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_420a963d-c484-47e8-8f00-06ecb15e323e/curl/0.log" Jan 22 09:19:00 crc kubenswrapper[4681]: I0122 09:19:00.897945 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84s74\" (UniqueName: \"kubernetes.io/projected/420a963d-c484-47e8-8f00-06ecb15e323e-kube-api-access-84s74\") on node \"crc\" DevicePath \"\"" Jan 22 09:19:01 crc kubenswrapper[4681]: I0122 09:19:01.129014 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-78bcbbdcff-dh6q6_3f65bf37-6f6e-4117-b952-d8d859f01094/prometheus-webhook-snmp/0.log" Jan 22 09:19:01 crc kubenswrapper[4681]: I0122 09:19:01.273151 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"420a963d-c484-47e8-8f00-06ecb15e323e","Type":"ContainerDied","Data":"65369348cf3edde5612a11484e4d75a54d73c4c0b3d477ef20f82356fad946ee"} Jan 22 09:19:01 crc kubenswrapper[4681]: I0122 09:19:01.273171 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Jan 22 09:19:01 crc kubenswrapper[4681]: I0122 09:19:01.273191 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65369348cf3edde5612a11484e4d75a54d73c4c0b3d477ef20f82356fad946ee" Jan 22 09:19:01 crc kubenswrapper[4681]: I0122 09:19:01.275004 4681 generic.go:334] "Generic (PLEG): container finished" podID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerID="f2e24b305b35b3a09fbd338e564438cca6ba09e567d5c64f883d53c47948b3c4" exitCode=0 Jan 22 09:19:01 crc kubenswrapper[4681]: I0122 09:19:01.275044 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerDied","Data":"f2e24b305b35b3a09fbd338e564438cca6ba09e567d5c64f883d53c47948b3c4"} Jan 22 09:19:01 crc kubenswrapper[4681]: I0122 09:19:01.275081 4681 scope.go:117] "RemoveContainer" containerID="6ac7ebdaf79be25ddb77d35c84045ee94bb99bba54c6d511a9e4c0510347ef3c" Jan 22 09:19:05 crc kubenswrapper[4681]: E0122 09:19:05.858438 4681 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/tripleomastercentos9/openstack-collectd:current-tripleo" Jan 22 09:19:05 crc kubenswrapper[4681]: E0122 09:19:05.859092 4681 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:smoketest-collectd,Image:quay.io/tripleomastercentos9/openstack-collectd:current-tripleo,Command:[/smoketest_collectd_entrypoint.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CLOUDNAME,Value:smoke1,ValueFrom:nil,},EnvVar{Name:ELASTICSEARCH_AUTH_PASS,Value:OCB0sx5v2uYazhhLx2gyeTsE,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_AUTH_TOKEN,Value:eyJhbGciOiJSUzI1NiIsImtpZCI6InF6SnFxNFFjbVk5VmJQZ2dNMmUxdHFmTlJlVWx4UDhSTlhIamV3RUx4WU0ifQ.eyJhdWQiOlsiaHR0cHM6Ly9rdWJlcm5ldGVzLmRlZmF1bHQuc3ZjIl0sImV4cCI6MTc2OTA3NzExNSwiaWF0IjoxNzY5MDczNTE1LCJpc3MiOiJodHRwczovL2t1YmVybmV0ZXMuZGVmYXVsdC5zdmMiLCJqdGkiOiI0YjNmNGY2Yy01NTdkLTQwMmMtYmQ2Ni00OTQyZTRkYTU2NWUiLCJrdWJlcm5ldGVzLmlvIjp7Im5hbWVzcGFjZSI6InNlcnZpY2UtdGVsZW1ldHJ5Iiwic2VydmljZWFjY291bnQiOnsibmFtZSI6InN0Zi1wcm9tZXRoZXVzLXJlYWRlciIsInVpZCI6Ijk2NmFiZDQxLWI3ODItNDE0Mi05YzAwLTEyZDRmZjEwNDQ0MSJ9fSwibmJmIjoxNzY5MDczNTE1LCJzdWIiOiJzeXN0ZW06c2VydmljZWFjY291bnQ6c2VydmljZS10ZWxlbWV0cnk6c3RmLXByb21ldGhldXMtcmVhZGVyIn0.ifaZJ7OffepwmdBF560Y3D2lrXCJSc1CQJ80G3VWZSGDOy7YeCUctx-_1GMlv1GQRuGfhX7M3O21iUQiTaE1M_O8ouKN9WwNA2AaaEB-GUiwmqIbkVewgy3D6ht4wwmXaNLJAYs_57cnJ1Soec1ZiJ_Wa9LOKkegVqAWC14Nj805SxYJOuVz2NCLe9XzWZjCxqh1aa4VxwzqzjNJ_UEpGxzRzOQq0J5Td01UFhJI_lUGVvcdB-qiU6A1M6_LtoVlB65m7EdYl4JHkx_5vOemHtREl557Su0pXPQfPtUNFPfcrwbvgthPDHz4nv1EbKrIjtWmJYYtpsgfHJyYd09PpKHt5qQubrlSM_bHhOtV8SoC8uDkqzcR6WK_4OKtjvPzPJHrKZjHn_sPthl0IlMrG3kOSxUv7fzBxdjccYa6BerRjVuTMp4jKEtdHu2ZRaVXN4Q05d2SXzEbhW9YSmmiC2cZSBJAhdYDXlHGpQFbcbd-1lkarpehQB7QsP70VT35P22vclXTf2OE4uTsCpuY6VdZGIpDU7hzi9WxhKhYiDLxpZp3QCqOkuLp_d1I3fnZoXX1OFbvF-mjTQzYm8sC3wctoJgRfs0esRID5XeUCLDywAnj3myiMIaNeZ4U_hxKUN38EMmOf_urvO3BYrPtF8Hm3R4JImNMPD-cQ1Kclew,ValueFrom:nil,},EnvVar{Name:OBSERVABILITY_STRATEGY,Value:<>,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:collectd-config,ReadOnly:false,MountPath:/etc/minimal-collectd.conf.template,SubPath:minimal-collectd.conf.template,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sensubility-config,ReadOnly:false,MountPath:/etc/collectd-sensubility.conf,SubPath:collectd-sensubility.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:healthcheck-log,ReadOnly:false,MountPath:/healthcheck.log,SubPath:healthcheck.log,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:collectd-entrypoint-script,ReadOnly:false,MountPath:/smoketest_collectd_entrypoint.sh,SubPath:smoketest_collectd_entrypoint.sh,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f8mqn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod stf-smoketest-smoke1-8tmfj_service-telemetry(042bf4f4-849a-41a1-8bc4-650995af5d74): ErrImagePull: rpc 
error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 22 09:19:06 crc kubenswrapper[4681]: I0122 09:19:06.315338 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerStarted","Data":"11b1c8cd564bf666acd3695c5c8a75cd8b968965a473b5788b1d1bc38ac5668b"} Jan 22 09:19:16 crc kubenswrapper[4681]: E0122 09:19:16.504564 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"smoketest-collectd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/stf-smoketest-smoke1-8tmfj" podUID="042bf4f4-849a-41a1-8bc4-650995af5d74" Jan 22 09:19:17 crc kubenswrapper[4681]: I0122 09:19:17.408343 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-8tmfj" event={"ID":"042bf4f4-849a-41a1-8bc4-650995af5d74","Type":"ContainerStarted","Data":"58e7ee10c61518d2ba160ad935f1a9d94d3670ec7bcef1668e00ee91ae586fae"} Jan 22 09:19:18 crc kubenswrapper[4681]: I0122 09:19:18.423328 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-8tmfj" event={"ID":"042bf4f4-849a-41a1-8bc4-650995af5d74","Type":"ContainerStarted","Data":"8fbc2cf352227532dc9eabe1a529314cc4db7c45e5c4d650db9004a9daf2f905"} Jan 22 09:19:31 crc kubenswrapper[4681]: I0122 09:19:31.272805 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-78bcbbdcff-dh6q6_3f65bf37-6f6e-4117-b952-d8d859f01094/prometheus-webhook-snmp/0.log" Jan 22 09:19:48 crc kubenswrapper[4681]: I0122 09:19:48.663778 4681 generic.go:334] "Generic (PLEG): container finished" podID="042bf4f4-849a-41a1-8bc4-650995af5d74" containerID="58e7ee10c61518d2ba160ad935f1a9d94d3670ec7bcef1668e00ee91ae586fae" exitCode=0 Jan 22 09:19:48 crc kubenswrapper[4681]: I0122 09:19:48.663985 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-8tmfj" event={"ID":"042bf4f4-849a-41a1-8bc4-650995af5d74","Type":"ContainerDied","Data":"58e7ee10c61518d2ba160ad935f1a9d94d3670ec7bcef1668e00ee91ae586fae"} Jan 22 09:19:48 crc kubenswrapper[4681]: I0122 09:19:48.664962 4681 scope.go:117] "RemoveContainer" containerID="58e7ee10c61518d2ba160ad935f1a9d94d3670ec7bcef1668e00ee91ae586fae" Jan 22 09:19:51 crc kubenswrapper[4681]: I0122 09:19:51.689959 4681 generic.go:334] "Generic (PLEG): container finished" podID="042bf4f4-849a-41a1-8bc4-650995af5d74" containerID="8fbc2cf352227532dc9eabe1a529314cc4db7c45e5c4d650db9004a9daf2f905" exitCode=0 Jan 22 09:19:51 crc kubenswrapper[4681]: I0122 09:19:51.690461 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-8tmfj" event={"ID":"042bf4f4-849a-41a1-8bc4-650995af5d74","Type":"ContainerDied","Data":"8fbc2cf352227532dc9eabe1a529314cc4db7c45e5c4d650db9004a9daf2f905"} Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.008859 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.168414 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-ceilometer-publisher\") pod \"042bf4f4-849a-41a1-8bc4-650995af5d74\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.168549 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-collectd-config\") pod \"042bf4f4-849a-41a1-8bc4-650995af5d74\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.168636 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-ceilometer-entrypoint-script\") pod \"042bf4f4-849a-41a1-8bc4-650995af5d74\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.168697 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-healthcheck-log\") pod \"042bf4f4-849a-41a1-8bc4-650995af5d74\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.168729 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-collectd-entrypoint-script\") pod \"042bf4f4-849a-41a1-8bc4-650995af5d74\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.169442 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-sensubility-config\") pod \"042bf4f4-849a-41a1-8bc4-650995af5d74\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.169515 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8mqn\" (UniqueName: \"kubernetes.io/projected/042bf4f4-849a-41a1-8bc4-650995af5d74-kube-api-access-f8mqn\") pod \"042bf4f4-849a-41a1-8bc4-650995af5d74\" (UID: \"042bf4f4-849a-41a1-8bc4-650995af5d74\") " Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.178949 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/042bf4f4-849a-41a1-8bc4-650995af5d74-kube-api-access-f8mqn" (OuterVolumeSpecName: "kube-api-access-f8mqn") pod "042bf4f4-849a-41a1-8bc4-650995af5d74" (UID: "042bf4f4-849a-41a1-8bc4-650995af5d74"). InnerVolumeSpecName "kube-api-access-f8mqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.186087 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "042bf4f4-849a-41a1-8bc4-650995af5d74" (UID: "042bf4f4-849a-41a1-8bc4-650995af5d74"). InnerVolumeSpecName "collectd-entrypoint-script". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.187069 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "042bf4f4-849a-41a1-8bc4-650995af5d74" (UID: "042bf4f4-849a-41a1-8bc4-650995af5d74"). InnerVolumeSpecName "ceilometer-publisher". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.192543 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "042bf4f4-849a-41a1-8bc4-650995af5d74" (UID: "042bf4f4-849a-41a1-8bc4-650995af5d74"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.194707 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "042bf4f4-849a-41a1-8bc4-650995af5d74" (UID: "042bf4f4-849a-41a1-8bc4-650995af5d74"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.197780 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "042bf4f4-849a-41a1-8bc4-650995af5d74" (UID: "042bf4f4-849a-41a1-8bc4-650995af5d74"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.198370 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "042bf4f4-849a-41a1-8bc4-650995af5d74" (UID: "042bf4f4-849a-41a1-8bc4-650995af5d74"). InnerVolumeSpecName "ceilometer-entrypoint-script". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.271462 4681 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-collectd-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.271535 4681 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.271557 4681 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-healthcheck-log\") on node \"crc\" DevicePath \"\"" Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.271577 4681 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.271596 4681 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-sensubility-config\") on node \"crc\" DevicePath \"\"" Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.271612 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8mqn\" (UniqueName: \"kubernetes.io/projected/042bf4f4-849a-41a1-8bc4-650995af5d74-kube-api-access-f8mqn\") on node \"crc\" DevicePath \"\"" Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.271629 4681 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/042bf4f4-849a-41a1-8bc4-650995af5d74-ceilometer-publisher\") on node \"crc\" DevicePath \"\"" Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.707507 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-8tmfj" event={"ID":"042bf4f4-849a-41a1-8bc4-650995af5d74","Type":"ContainerDied","Data":"f546399850b41e545481025deed4d1f372e17978324691c715b6db84eb61a428"} Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.707565 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f546399850b41e545481025deed4d1f372e17978324691c715b6db84eb61a428" Jan 22 09:19:53 crc kubenswrapper[4681]: I0122 09:19:53.707602 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-8tmfj" Jan 22 09:19:55 crc kubenswrapper[4681]: I0122 09:19:55.054283 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-8tmfj_042bf4f4-849a-41a1-8bc4-650995af5d74/smoketest-collectd/0.log" Jan 22 09:19:55 crc kubenswrapper[4681]: I0122 09:19:55.388476 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-8tmfj_042bf4f4-849a-41a1-8bc4-650995af5d74/smoketest-ceilometer/0.log" Jan 22 09:19:55 crc kubenswrapper[4681]: I0122 09:19:55.630675 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-rsshc_76112dc3-3650-4767-ae53-7f11a6baac67/default-interconnect/0.log" Jan 22 09:19:55 crc kubenswrapper[4681]: I0122 09:19:55.879558 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr_f38c0f28-9c07-459b-89e4-0aa2c7847262/bridge/0.log" Jan 22 09:19:56 crc kubenswrapper[4681]: I0122 09:19:56.143441 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr_f38c0f28-9c07-459b-89e4-0aa2c7847262/sg-core/0.log" Jan 22 09:19:56 crc kubenswrapper[4681]: I0122 09:19:56.389945 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8_aa7f89fd-9692-40d1-85a5-54936ab44840/bridge/1.log" Jan 22 09:19:56 crc kubenswrapper[4681]: I0122 09:19:56.642839 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8_aa7f89fd-9692-40d1-85a5-54936ab44840/sg-core/0.log" Jan 22 09:19:56 crc kubenswrapper[4681]: I0122 09:19:56.937919 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn_447b910c-7755-4492-87ef-fa6a55ee9698/bridge/0.log" Jan 22 09:19:57 crc kubenswrapper[4681]: I0122 09:19:57.166645 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn_447b910c-7755-4492-87ef-fa6a55ee9698/sg-core/0.log" Jan 22 09:19:57 crc kubenswrapper[4681]: I0122 09:19:57.411622 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b_b152282d-9a3b-41c0-bb88-36de5b273303/bridge/1.log" Jan 22 09:19:57 crc kubenswrapper[4681]: I0122 09:19:57.654000 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b_b152282d-9a3b-41c0-bb88-36de5b273303/sg-core/0.log" Jan 22 09:19:57 crc kubenswrapper[4681]: I0122 09:19:57.893844 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x_f4d025fe-ae0e-48a3-bdba-35983685d558/bridge/1.log" Jan 22 09:19:58 crc kubenswrapper[4681]: I0122 09:19:58.132771 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x_f4d025fe-ae0e-48a3-bdba-35983685d558/sg-core/0.log" Jan 22 09:20:00 crc kubenswrapper[4681]: I0122 09:20:00.286986 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_smart-gateway-operator-bbbc889bc-859fl_91a6ad3b-58ad-4095-97df-be878b439ac6/operator/0.log" Jan 22 09:20:00 crc kubenswrapper[4681]: I0122 09:20:00.540299 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_187498e5-8d96-4912-8ad6-5a87ddca4a88/prometheus/0.log" Jan 22 09:20:00 crc kubenswrapper[4681]: I0122 09:20:00.794100 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_44b13aea-bee1-4576-87b4-d41165fcc2fa/elasticsearch/0.log" Jan 22 09:20:01 crc kubenswrapper[4681]: I0122 09:20:01.046042 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-78bcbbdcff-dh6q6_3f65bf37-6f6e-4117-b952-d8d859f01094/prometheus-webhook-snmp/0.log" Jan 22 09:20:01 crc kubenswrapper[4681]: I0122 09:20:01.299640 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_ef78e840-4520-4de1-8abe-af82e052bfa3/alertmanager/0.log" Jan 22 09:20:14 crc kubenswrapper[4681]: I0122 09:20:14.254064 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-55b89ddfb9-748rx_3f0e9240-f934-448c-8027-b5d44f6ca38c/operator/0.log" Jan 22 09:20:16 crc kubenswrapper[4681]: I0122 09:20:16.528403 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bbbc889bc-859fl_91a6ad3b-58ad-4095-97df-be878b439ac6/operator/0.log" Jan 22 09:20:16 crc kubenswrapper[4681]: I0122 09:20:16.794319 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_7790a75e-2dfd-4f34-8327-ca553ec25c44/qdr/0.log" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.169169 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-grmqt/must-gather-jm9c2"] Jan 22 09:20:51 crc kubenswrapper[4681]: E0122 09:20:51.170625 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="042bf4f4-849a-41a1-8bc4-650995af5d74" containerName="smoketest-collectd" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.170647 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="042bf4f4-849a-41a1-8bc4-650995af5d74" containerName="smoketest-collectd" Jan 22 09:20:51 crc kubenswrapper[4681]: E0122 09:20:51.170689 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="042bf4f4-849a-41a1-8bc4-650995af5d74" containerName="smoketest-ceilometer" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.170699 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="042bf4f4-849a-41a1-8bc4-650995af5d74" containerName="smoketest-ceilometer" Jan 22 09:20:51 crc kubenswrapper[4681]: E0122 09:20:51.170716 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="420a963d-c484-47e8-8f00-06ecb15e323e" containerName="curl" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.170725 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="420a963d-c484-47e8-8f00-06ecb15e323e" containerName="curl" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.170909 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="420a963d-c484-47e8-8f00-06ecb15e323e" containerName="curl" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.170924 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="042bf4f4-849a-41a1-8bc4-650995af5d74" containerName="smoketest-ceilometer" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.170949 
4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="042bf4f4-849a-41a1-8bc4-650995af5d74" containerName="smoketest-collectd" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.172022 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-grmqt/must-gather-jm9c2" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.173809 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-grmqt"/"default-dockercfg-k8msv" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.179376 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-grmqt"/"openshift-service-ca.crt" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.185035 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2cac7025-7732-4a3f-adba-6de5be54c867-must-gather-output\") pod \"must-gather-jm9c2\" (UID: \"2cac7025-7732-4a3f-adba-6de5be54c867\") " pod="openshift-must-gather-grmqt/must-gather-jm9c2" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.185102 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcnct\" (UniqueName: \"kubernetes.io/projected/2cac7025-7732-4a3f-adba-6de5be54c867-kube-api-access-hcnct\") pod \"must-gather-jm9c2\" (UID: \"2cac7025-7732-4a3f-adba-6de5be54c867\") " pod="openshift-must-gather-grmqt/must-gather-jm9c2" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.188490 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-grmqt"/"kube-root-ca.crt" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.193751 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-grmqt/must-gather-jm9c2"] Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.286432 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2cac7025-7732-4a3f-adba-6de5be54c867-must-gather-output\") pod \"must-gather-jm9c2\" (UID: \"2cac7025-7732-4a3f-adba-6de5be54c867\") " pod="openshift-must-gather-grmqt/must-gather-jm9c2" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.286743 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcnct\" (UniqueName: \"kubernetes.io/projected/2cac7025-7732-4a3f-adba-6de5be54c867-kube-api-access-hcnct\") pod \"must-gather-jm9c2\" (UID: \"2cac7025-7732-4a3f-adba-6de5be54c867\") " pod="openshift-must-gather-grmqt/must-gather-jm9c2" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.286874 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2cac7025-7732-4a3f-adba-6de5be54c867-must-gather-output\") pod \"must-gather-jm9c2\" (UID: \"2cac7025-7732-4a3f-adba-6de5be54c867\") " pod="openshift-must-gather-grmqt/must-gather-jm9c2" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.315254 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcnct\" (UniqueName: \"kubernetes.io/projected/2cac7025-7732-4a3f-adba-6de5be54c867-kube-api-access-hcnct\") pod \"must-gather-jm9c2\" (UID: \"2cac7025-7732-4a3f-adba-6de5be54c867\") " pod="openshift-must-gather-grmqt/must-gather-jm9c2" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.491015 4681 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-grmqt/must-gather-jm9c2" Jan 22 09:20:51 crc kubenswrapper[4681]: I0122 09:20:51.927876 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-grmqt/must-gather-jm9c2"] Jan 22 09:20:52 crc kubenswrapper[4681]: I0122 09:20:52.450520 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-grmqt/must-gather-jm9c2" event={"ID":"2cac7025-7732-4a3f-adba-6de5be54c867","Type":"ContainerStarted","Data":"112d798515fe99a0317e337d926a3a4bad59e313803980af34e834c0178e9ad2"} Jan 22 09:21:00 crc kubenswrapper[4681]: I0122 09:21:00.535337 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-grmqt/must-gather-jm9c2" event={"ID":"2cac7025-7732-4a3f-adba-6de5be54c867","Type":"ContainerStarted","Data":"50dad2425ca54f31100099aafd60ed4cb99168a304a097ea7e7e7e3c6a0dad2f"} Jan 22 09:21:00 crc kubenswrapper[4681]: I0122 09:21:00.536044 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-grmqt/must-gather-jm9c2" event={"ID":"2cac7025-7732-4a3f-adba-6de5be54c867","Type":"ContainerStarted","Data":"70f8f397320840d8a09fde6f47e3d10f141556b8722928eed138d390667c87a1"} Jan 22 09:21:00 crc kubenswrapper[4681]: I0122 09:21:00.562850 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-grmqt/must-gather-jm9c2" podStartSLOduration=1.843987337 podStartE2EDuration="9.562822229s" podCreationTimestamp="2026-01-22 09:20:51 +0000 UTC" firstStartedPulling="2026-01-22 09:20:51.939078464 +0000 UTC m=+1042.764988969" lastFinishedPulling="2026-01-22 09:20:59.657913366 +0000 UTC m=+1050.483823861" observedRunningTime="2026-01-22 09:21:00.553186491 +0000 UTC m=+1051.379096996" watchObservedRunningTime="2026-01-22 09:21:00.562822229 +0000 UTC m=+1051.388732764" Jan 22 09:21:13 crc kubenswrapper[4681]: I0122 09:21:13.997951 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-gfd8p_74aff32c-9835-440b-9961-5fbcada6c96b/control-plane-machine-set-operator/0.log" Jan 22 09:21:14 crc kubenswrapper[4681]: I0122 09:21:14.016117 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mtcwj_5d28f6be-a6f5-4605-b071-ec453a08a7d7/kube-rbac-proxy/0.log" Jan 22 09:21:14 crc kubenswrapper[4681]: I0122 09:21:14.027946 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mtcwj_5d28f6be-a6f5-4605-b071-ec453a08a7d7/machine-api-operator/0.log" Jan 22 09:21:20 crc kubenswrapper[4681]: I0122 09:21:20.082823 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-kfsd2_f293d0c8-2cee-4754-9edc-67524679619a/cert-manager-controller/0.log" Jan 22 09:21:20 crc kubenswrapper[4681]: I0122 09:21:20.093019 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-p7hrl_2177d3cd-4719-4db5-b8fc-cbfa8e83ddf2/cert-manager-cainjector/0.log" Jan 22 09:21:20 crc kubenswrapper[4681]: I0122 09:21:20.101601 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-fvp42_9d8dafc2-baca-45a7-bd2a-731f4011097e/cert-manager-webhook/0.log" Jan 22 09:21:25 crc kubenswrapper[4681]: I0122 09:21:25.967478 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-6zkmp_a1a87f01-0828-4b50-9567-3e88120e3de6/prometheus-operator/0.log" Jan 22 09:21:25 crc kubenswrapper[4681]: I0122 09:21:25.980325 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp_91e0d48d-41f3-469c-8743-00b4ee3cdc94/prometheus-operator-admission-webhook/0.log" Jan 22 09:21:25 crc kubenswrapper[4681]: I0122 09:21:25.995026 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-547cd868c5-zm777_baa6f894-0e43-4f0a-ba66-a9dd75edd31f/prometheus-operator-admission-webhook/0.log" Jan 22 09:21:26 crc kubenswrapper[4681]: I0122 09:21:26.012707 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-8ltql_1928d100-6670-42d8-898f-6102dfbfee50/operator/0.log" Jan 22 09:21:26 crc kubenswrapper[4681]: I0122 09:21:26.029294 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-xx7ql_fab6bf9a-3bb4-4bc8-a80e-1cd0ab9d6d2a/perses-operator/0.log" Jan 22 09:21:26 crc kubenswrapper[4681]: I0122 09:21:26.030845 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:21:26 crc kubenswrapper[4681]: I0122 09:21:26.030883 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:21:31 crc kubenswrapper[4681]: I0122 09:21:31.561727 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2_3e7ae8ed-882c-4537-9699-344ae1d6fa06/extract/0.log" Jan 22 09:21:31 crc kubenswrapper[4681]: I0122 09:21:31.568876 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2_3e7ae8ed-882c-4537-9699-344ae1d6fa06/util/0.log" Jan 22 09:21:31 crc kubenswrapper[4681]: I0122 09:21:31.610496 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931apnhz2_3e7ae8ed-882c-4537-9699-344ae1d6fa06/pull/0.log" Jan 22 09:21:31 crc kubenswrapper[4681]: I0122 09:21:31.625014 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm_11634169-0b47-4aa5-90d3-6038111be8f6/extract/0.log" Jan 22 09:21:31 crc kubenswrapper[4681]: I0122 09:21:31.631686 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm_11634169-0b47-4aa5-90d3-6038111be8f6/util/0.log" Jan 22 09:21:31 crc kubenswrapper[4681]: I0122 09:21:31.641667 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8f4ldxm_11634169-0b47-4aa5-90d3-6038111be8f6/pull/0.log" Jan 22 09:21:31 crc 
kubenswrapper[4681]: I0122 09:21:31.657668 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g_bab04fd7-466e-4cca-9b7e-55e45896f903/extract/0.log" Jan 22 09:21:31 crc kubenswrapper[4681]: I0122 09:21:31.670554 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g_bab04fd7-466e-4cca-9b7e-55e45896f903/util/0.log" Jan 22 09:21:31 crc kubenswrapper[4681]: I0122 09:21:31.691414 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5exm25g_bab04fd7-466e-4cca-9b7e-55e45896f903/pull/0.log" Jan 22 09:21:31 crc kubenswrapper[4681]: I0122 09:21:31.703785 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv_b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c/extract/0.log" Jan 22 09:21:31 crc kubenswrapper[4681]: I0122 09:21:31.709357 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv_b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c/util/0.log" Jan 22 09:21:31 crc kubenswrapper[4681]: I0122 09:21:31.715953 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08r5skv_b1a0c173-d88f-42a7-8e5a-c8bf03be0c7c/pull/0.log" Jan 22 09:21:31 crc kubenswrapper[4681]: I0122 09:21:31.939257 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7jtrb_94bafc7c-e628-4827-b1b7-f016d562bd9f/registry-server/0.log" Jan 22 09:21:31 crc kubenswrapper[4681]: I0122 09:21:31.943497 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7jtrb_94bafc7c-e628-4827-b1b7-f016d562bd9f/extract-utilities/0.log" Jan 22 09:21:31 crc kubenswrapper[4681]: I0122 09:21:31.950170 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7jtrb_94bafc7c-e628-4827-b1b7-f016d562bd9f/extract-content/0.log" Jan 22 09:21:32 crc kubenswrapper[4681]: I0122 09:21:32.148318 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rbqlg_b7828afb-82af-4b1c-a8f8-900963d42fd1/registry-server/0.log" Jan 22 09:21:32 crc kubenswrapper[4681]: I0122 09:21:32.154149 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rbqlg_b7828afb-82af-4b1c-a8f8-900963d42fd1/extract-utilities/0.log" Jan 22 09:21:32 crc kubenswrapper[4681]: I0122 09:21:32.159324 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rbqlg_b7828afb-82af-4b1c-a8f8-900963d42fd1/extract-content/0.log" Jan 22 09:21:32 crc kubenswrapper[4681]: I0122 09:21:32.176478 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zf5vt_45a25363-c0b4-4fdb-a773-fc99c6653bbb/marketplace-operator/0.log" Jan 22 09:21:32 crc kubenswrapper[4681]: I0122 09:21:32.355610 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8lt9z_106c3866-7eec-46b8-bf93-cd3edad7c59c/registry-server/0.log" Jan 22 09:21:32 crc kubenswrapper[4681]: I0122 09:21:32.361521 4681 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8lt9z_106c3866-7eec-46b8-bf93-cd3edad7c59c/extract-utilities/0.log" Jan 22 09:21:32 crc kubenswrapper[4681]: I0122 09:21:32.368569 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8lt9z_106c3866-7eec-46b8-bf93-cd3edad7c59c/extract-content/0.log" Jan 22 09:21:37 crc kubenswrapper[4681]: I0122 09:21:37.122974 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-6zkmp_a1a87f01-0828-4b50-9567-3e88120e3de6/prometheus-operator/0.log" Jan 22 09:21:37 crc kubenswrapper[4681]: I0122 09:21:37.142726 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp_91e0d48d-41f3-469c-8743-00b4ee3cdc94/prometheus-operator-admission-webhook/0.log" Jan 22 09:21:37 crc kubenswrapper[4681]: I0122 09:21:37.156074 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-547cd868c5-zm777_baa6f894-0e43-4f0a-ba66-a9dd75edd31f/prometheus-operator-admission-webhook/0.log" Jan 22 09:21:37 crc kubenswrapper[4681]: I0122 09:21:37.188650 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-8ltql_1928d100-6670-42d8-898f-6102dfbfee50/operator/0.log" Jan 22 09:21:37 crc kubenswrapper[4681]: I0122 09:21:37.203316 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-xx7ql_fab6bf9a-3bb4-4bc8-a80e-1cd0ab9d6d2a/perses-operator/0.log" Jan 22 09:21:47 crc kubenswrapper[4681]: I0122 09:21:47.317529 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-6zkmp_a1a87f01-0828-4b50-9567-3e88120e3de6/prometheus-operator/0.log" Jan 22 09:21:47 crc kubenswrapper[4681]: I0122 09:21:47.342300 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-547cd868c5-c2bbp_91e0d48d-41f3-469c-8743-00b4ee3cdc94/prometheus-operator-admission-webhook/0.log" Jan 22 09:21:47 crc kubenswrapper[4681]: I0122 09:21:47.349848 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-547cd868c5-zm777_baa6f894-0e43-4f0a-ba66-a9dd75edd31f/prometheus-operator-admission-webhook/0.log" Jan 22 09:21:47 crc kubenswrapper[4681]: I0122 09:21:47.364175 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-8ltql_1928d100-6670-42d8-898f-6102dfbfee50/operator/0.log" Jan 22 09:21:47 crc kubenswrapper[4681]: I0122 09:21:47.378859 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-xx7ql_fab6bf9a-3bb4-4bc8-a80e-1cd0ab9d6d2a/perses-operator/0.log" Jan 22 09:21:47 crc kubenswrapper[4681]: I0122 09:21:47.449799 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-kfsd2_f293d0c8-2cee-4754-9edc-67524679619a/cert-manager-controller/0.log" Jan 22 09:21:47 crc kubenswrapper[4681]: I0122 09:21:47.463153 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-p7hrl_2177d3cd-4719-4db5-b8fc-cbfa8e83ddf2/cert-manager-cainjector/0.log" Jan 22 09:21:47 crc kubenswrapper[4681]: I0122 09:21:47.476765 4681 log.go:25] 
"Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-fvp42_9d8dafc2-baca-45a7-bd2a-731f4011097e/cert-manager-webhook/0.log" Jan 22 09:21:48 crc kubenswrapper[4681]: I0122 09:21:48.052198 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-kfsd2_f293d0c8-2cee-4754-9edc-67524679619a/cert-manager-controller/0.log" Jan 22 09:21:48 crc kubenswrapper[4681]: I0122 09:21:48.065511 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-p7hrl_2177d3cd-4719-4db5-b8fc-cbfa8e83ddf2/cert-manager-cainjector/0.log" Jan 22 09:21:48 crc kubenswrapper[4681]: I0122 09:21:48.083563 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-fvp42_9d8dafc2-baca-45a7-bd2a-731f4011097e/cert-manager-webhook/0.log" Jan 22 09:21:48 crc kubenswrapper[4681]: I0122 09:21:48.490161 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-gfd8p_74aff32c-9835-440b-9961-5fbcada6c96b/control-plane-machine-set-operator/0.log" Jan 22 09:21:48 crc kubenswrapper[4681]: I0122 09:21:48.502843 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mtcwj_5d28f6be-a6f5-4605-b071-ec453a08a7d7/kube-rbac-proxy/0.log" Jan 22 09:21:48 crc kubenswrapper[4681]: I0122 09:21:48.510208 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mtcwj_5d28f6be-a6f5-4605-b071-ec453a08a7d7/machine-api-operator/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.078454 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd_689f04b5-a34d-4a7a-ad94-5ec5e77a8371/extract/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.088654 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd_689f04b5-a34d-4a7a-ad94-5ec5e77a8371/util/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.095253 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_27e4a3b82b847aaaac340f98fd9ec51c99f28242b589c6c251a26fbc3b6pwwd_689f04b5-a34d-4a7a-ad94-5ec5e77a8371/pull/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.106970 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_ef78e840-4520-4de1-8abe-af82e052bfa3/alertmanager/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.114285 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_ef78e840-4520-4de1-8abe-af82e052bfa3/config-reloader/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.121637 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_ef78e840-4520-4de1-8abe-af82e052bfa3/oauth-proxy/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.126939 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_ef78e840-4520-4de1-8abe-af82e052bfa3/init-config-reloader/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.137111 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j_910fbf37-770d-44c7-812d-805b7097b592/extract/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.144027 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j_910fbf37-770d-44c7-812d-805b7097b592/util/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.152403 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_cd07ebce3b618aaffa8e106dab3e8eb93287fbb0e4c5a9c0f6ea8fc7ebkwn8j_910fbf37-770d-44c7-812d-805b7097b592/pull/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.223774 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_420a963d-c484-47e8-8f00-06ecb15e323e/curl/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.232959 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b_b152282d-9a3b-41c0-bb88-36de5b273303/bridge/1.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.233172 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b_b152282d-9a3b-41c0-bb88-36de5b273303/bridge/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.237011 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-7fd54dfd8f-f2z7b_b152282d-9a3b-41c0-bb88-36de5b273303/sg-core/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.246950 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn_447b910c-7755-4492-87ef-fa6a55ee9698/oauth-proxy/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.255070 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn_447b910c-7755-4492-87ef-fa6a55ee9698/bridge/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.266578 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-b57f974ff-p24nn_447b910c-7755-4492-87ef-fa6a55ee9698/sg-core/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.276644 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8_aa7f89fd-9692-40d1-85a5-54936ab44840/bridge/1.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.277518 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8_aa7f89fd-9692-40d1-85a5-54936ab44840/bridge/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.282774 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7b6cdc7bcf-vp6s8_aa7f89fd-9692-40d1-85a5-54936ab44840/sg-core/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.292657 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr_f38c0f28-9c07-459b-89e4-0aa2c7847262/oauth-proxy/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.300534 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr_f38c0f28-9c07-459b-89e4-0aa2c7847262/bridge/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.308686 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7996dc9458-52gpr_f38c0f28-9c07-459b-89e4-0aa2c7847262/sg-core/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.319169 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x_f4d025fe-ae0e-48a3-bdba-35983685d558/oauth-proxy/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.325710 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x_f4d025fe-ae0e-48a3-bdba-35983685d558/bridge/1.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.326203 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x_f4d025fe-ae0e-48a3-bdba-35983685d558/bridge/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.330715 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-6864f4fb65-4rt7x_f4d025fe-ae0e-48a3-bdba-35983685d558/sg-core/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.348326 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-rsshc_76112dc3-3650-4767-ae53-7f11a6baac67/default-interconnect/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.356789 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-78bcbbdcff-dh6q6_3f65bf37-6f6e-4117-b952-d8d859f01094/prometheus-webhook-snmp/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.387135 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elastic-operator-577ff7768c-mpfxh_c72b89e7-f149-4a53-b660-54ca9f4cf900/manager/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.409795 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_44b13aea-bee1-4576-87b4-d41165fcc2fa/elasticsearch/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.416747 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_44b13aea-bee1-4576-87b4-d41165fcc2fa/elastic-internal-init-filesystem/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.422305 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_44b13aea-bee1-4576-87b4-d41165fcc2fa/elastic-internal-suspend/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.434814 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_interconnect-operator-5bb49f789d-z6l9z_6d9c13a0-8fe9-486f-a6f5-0bdec2ab2686/interconnect-operator/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.449292 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_187498e5-8d96-4912-8ad6-5a87ddca4a88/prometheus/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.468992 4681 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_prometheus-default-0_187498e5-8d96-4912-8ad6-5a87ddca4a88/config-reloader/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.477705 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_187498e5-8d96-4912-8ad6-5a87ddca4a88/oauth-proxy/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.485859 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_187498e5-8d96-4912-8ad6-5a87ddca4a88/init-config-reloader/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.499241 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_7790a75e-2dfd-4f34-8327-ca553ec25c44/qdr/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.516311 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-2-build_1e6f6f66-599e-48a5-b4b3-e9eaf71500e6/docker-build/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.524967 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-2-build_1e6f6f66-599e-48a5-b4b3-e9eaf71500e6/git-clone/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.531778 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-2-build_1e6f6f66-599e-48a5-b4b3-e9eaf71500e6/manage-dockerfile/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.546034 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-operators-b9xkl_c1271182-7432-477c-9e01-0b0b6d850377/registry-server/0.log" Jan 22 09:21:49 crc kubenswrapper[4681]: I0122 09:21:49.714870 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-55b89ddfb9-748rx_3f0e9240-f934-448c-8027-b5d44f6ca38c/operator/0.log" Jan 22 09:21:51 crc kubenswrapper[4681]: I0122 09:21:51.627873 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bbbc889bc-859fl_91a6ad3b-58ad-4095-97df-be878b439ac6/operator/0.log" Jan 22 09:21:51 crc kubenswrapper[4681]: I0122 09:21:51.647591 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-8tmfj_042bf4f4-849a-41a1-8bc4-650995af5d74/smoketest-collectd/0.log" Jan 22 09:21:51 crc kubenswrapper[4681]: I0122 09:21:51.654018 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-8tmfj_042bf4f4-849a-41a1-8bc4-650995af5d74/smoketest-ceilometer/0.log" Jan 22 09:21:53 crc kubenswrapper[4681]: I0122 09:21:53.072436 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knrhw_c85970b8-70b4-44fc-a45d-8409cf53d709/kube-multus-additional-cni-plugins/0.log" Jan 22 09:21:53 crc kubenswrapper[4681]: I0122 09:21:53.080812 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knrhw_c85970b8-70b4-44fc-a45d-8409cf53d709/egress-router-binary-copy/0.log" Jan 22 09:21:53 crc kubenswrapper[4681]: I0122 09:21:53.089530 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knrhw_c85970b8-70b4-44fc-a45d-8409cf53d709/cni-plugins/0.log" Jan 22 09:21:53 crc kubenswrapper[4681]: I0122 09:21:53.099840 4681 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knrhw_c85970b8-70b4-44fc-a45d-8409cf53d709/bond-cni-plugin/0.log" Jan 22 09:21:53 crc kubenswrapper[4681]: I0122 09:21:53.110041 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knrhw_c85970b8-70b4-44fc-a45d-8409cf53d709/routeoverride-cni/0.log" Jan 22 09:21:53 crc kubenswrapper[4681]: I0122 09:21:53.119351 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knrhw_c85970b8-70b4-44fc-a45d-8409cf53d709/whereabouts-cni-bincopy/0.log" Jan 22 09:21:53 crc kubenswrapper[4681]: I0122 09:21:53.129591 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-knrhw_c85970b8-70b4-44fc-a45d-8409cf53d709/whereabouts-cni/0.log" Jan 22 09:21:53 crc kubenswrapper[4681]: I0122 09:21:53.143058 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-6p99f_4db34d31-e1cc-4afd-afb4-b0a5a535053d/multus-admission-controller/0.log" Jan 22 09:21:53 crc kubenswrapper[4681]: I0122 09:21:53.153258 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-6p99f_4db34d31-e1cc-4afd-afb4-b0a5a535053d/kube-rbac-proxy/0.log" Jan 22 09:21:53 crc kubenswrapper[4681]: I0122 09:21:53.184390 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xpdjl_1976858f-1664-4b36-9929-65cc8fe9d0ad/kube-multus/1.log" Jan 22 09:21:53 crc kubenswrapper[4681]: I0122 09:21:53.224542 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xpdjl_1976858f-1664-4b36-9929-65cc8fe9d0ad/kube-multus/0.log" Jan 22 09:21:53 crc kubenswrapper[4681]: I0122 09:21:53.253222 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vjf2g_2e7e003a-24ec-4f48-a156-a5ed6a3afd03/network-metrics-daemon/0.log" Jan 22 09:21:53 crc kubenswrapper[4681]: I0122 09:21:53.259047 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vjf2g_2e7e003a-24ec-4f48-a156-a5ed6a3afd03/kube-rbac-proxy/0.log" Jan 22 09:21:56 crc kubenswrapper[4681]: I0122 09:21:56.031091 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:21:56 crc kubenswrapper[4681]: I0122 09:21:56.031480 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:22:26 crc kubenswrapper[4681]: I0122 09:22:26.031291 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:22:26 crc kubenswrapper[4681]: I0122 09:22:26.031906 4681 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:22:26 crc kubenswrapper[4681]: I0122 09:22:26.031969 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:22:26 crc kubenswrapper[4681]: I0122 09:22:26.234363 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"11b1c8cd564bf666acd3695c5c8a75cd8b968965a473b5788b1d1bc38ac5668b"} pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 09:22:26 crc kubenswrapper[4681]: I0122 09:22:26.234452 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" containerID="cri-o://11b1c8cd564bf666acd3695c5c8a75cd8b968965a473b5788b1d1bc38ac5668b" gracePeriod=600 Jan 22 09:22:27 crc kubenswrapper[4681]: I0122 09:22:27.246739 4681 generic.go:334] "Generic (PLEG): container finished" podID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerID="11b1c8cd564bf666acd3695c5c8a75cd8b968965a473b5788b1d1bc38ac5668b" exitCode=0 Jan 22 09:22:27 crc kubenswrapper[4681]: I0122 09:22:27.246915 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerDied","Data":"11b1c8cd564bf666acd3695c5c8a75cd8b968965a473b5788b1d1bc38ac5668b"} Jan 22 09:22:27 crc kubenswrapper[4681]: I0122 09:22:27.247686 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerStarted","Data":"e28fe2c62bd4e388fc876768b83792c99b67b618087c90e219797afad38b74ea"} Jan 22 09:22:27 crc kubenswrapper[4681]: I0122 09:22:27.247724 4681 scope.go:117] "RemoveContainer" containerID="f2e24b305b35b3a09fbd338e564438cca6ba09e567d5c64f883d53c47948b3c4" Jan 22 09:23:11 crc kubenswrapper[4681]: I0122 09:23:11.986487 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-operators-pss6b"] Jan 22 09:23:11 crc kubenswrapper[4681]: I0122 09:23:11.988016 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-pss6b" Jan 22 09:23:12 crc kubenswrapper[4681]: I0122 09:23:12.006115 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-pss6b"] Jan 22 09:23:12 crc kubenswrapper[4681]: I0122 09:23:12.101741 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pcvv\" (UniqueName: \"kubernetes.io/projected/b3758235-43da-44a7-b8cf-8b056770f095-kube-api-access-6pcvv\") pod \"service-telemetry-framework-operators-pss6b\" (UID: \"b3758235-43da-44a7-b8cf-8b056770f095\") " pod="service-telemetry/service-telemetry-framework-operators-pss6b" Jan 22 09:23:12 crc kubenswrapper[4681]: I0122 09:23:12.204125 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pcvv\" (UniqueName: \"kubernetes.io/projected/b3758235-43da-44a7-b8cf-8b056770f095-kube-api-access-6pcvv\") pod \"service-telemetry-framework-operators-pss6b\" (UID: \"b3758235-43da-44a7-b8cf-8b056770f095\") " pod="service-telemetry/service-telemetry-framework-operators-pss6b" Jan 22 09:23:12 crc kubenswrapper[4681]: I0122 09:23:12.246621 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pcvv\" (UniqueName: \"kubernetes.io/projected/b3758235-43da-44a7-b8cf-8b056770f095-kube-api-access-6pcvv\") pod \"service-telemetry-framework-operators-pss6b\" (UID: \"b3758235-43da-44a7-b8cf-8b056770f095\") " pod="service-telemetry/service-telemetry-framework-operators-pss6b" Jan 22 09:23:12 crc kubenswrapper[4681]: I0122 09:23:12.309794 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-pss6b" Jan 22 09:23:12 crc kubenswrapper[4681]: I0122 09:23:12.623628 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-pss6b"] Jan 22 09:23:12 crc kubenswrapper[4681]: W0122 09:23:12.630889 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3758235_43da_44a7_b8cf_8b056770f095.slice/crio-02adfed5d6f73f69ee606d8e28a600c2f78cf6eac3161f96b75f8e398e635943 WatchSource:0}: Error finding container 02adfed5d6f73f69ee606d8e28a600c2f78cf6eac3161f96b75f8e398e635943: Status 404 returned error can't find the container with id 02adfed5d6f73f69ee606d8e28a600c2f78cf6eac3161f96b75f8e398e635943 Jan 22 09:23:12 crc kubenswrapper[4681]: I0122 09:23:12.633315 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 09:23:12 crc kubenswrapper[4681]: I0122 09:23:12.702692 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-pss6b" event={"ID":"b3758235-43da-44a7-b8cf-8b056770f095","Type":"ContainerStarted","Data":"02adfed5d6f73f69ee606d8e28a600c2f78cf6eac3161f96b75f8e398e635943"} Jan 22 09:23:13 crc kubenswrapper[4681]: I0122 09:23:13.711471 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-pss6b" event={"ID":"b3758235-43da-44a7-b8cf-8b056770f095","Type":"ContainerStarted","Data":"69566da60df3752eb2c92f66b5b33f6b1312b35c84f34065f1953e6cc19de4d3"} Jan 22 09:23:13 crc kubenswrapper[4681]: I0122 09:23:13.737056 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/service-telemetry-framework-operators-pss6b" podStartSLOduration=2.62252224 podStartE2EDuration="2.737028655s" podCreationTimestamp="2026-01-22 09:23:11 +0000 UTC" firstStartedPulling="2026-01-22 09:23:12.632996942 +0000 UTC m=+1183.458907457" lastFinishedPulling="2026-01-22 09:23:12.747503357 +0000 UTC m=+1183.573413872" observedRunningTime="2026-01-22 09:23:13.733801939 +0000 UTC m=+1184.559712484" watchObservedRunningTime="2026-01-22 09:23:13.737028655 +0000 UTC m=+1184.562939200" Jan 22 09:23:22 crc kubenswrapper[4681]: I0122 09:23:22.310651 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/service-telemetry-framework-operators-pss6b" Jan 22 09:23:22 crc kubenswrapper[4681]: I0122 09:23:22.311104 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/service-telemetry-framework-operators-pss6b" Jan 22 09:23:22 crc kubenswrapper[4681]: I0122 09:23:22.344870 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/service-telemetry-framework-operators-pss6b" Jan 22 09:23:23 crc kubenswrapper[4681]: I0122 09:23:23.042101 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/service-telemetry-framework-operators-pss6b" Jan 22 09:23:23 crc kubenswrapper[4681]: I0122 09:23:23.581171 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-pss6b"] Jan 22 09:23:25 crc kubenswrapper[4681]: I0122 09:23:25.017085 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-framework-operators-pss6b" podUID="b3758235-43da-44a7-b8cf-8b056770f095" containerName="registry-server" containerID="cri-o://69566da60df3752eb2c92f66b5b33f6b1312b35c84f34065f1953e6cc19de4d3" gracePeriod=2 Jan 22 09:23:25 crc kubenswrapper[4681]: I0122 09:23:25.430473 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-pss6b" Jan 22 09:23:25 crc kubenswrapper[4681]: I0122 09:23:25.618471 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pcvv\" (UniqueName: \"kubernetes.io/projected/b3758235-43da-44a7-b8cf-8b056770f095-kube-api-access-6pcvv\") pod \"b3758235-43da-44a7-b8cf-8b056770f095\" (UID: \"b3758235-43da-44a7-b8cf-8b056770f095\") " Jan 22 09:23:25 crc kubenswrapper[4681]: I0122 09:23:25.626836 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3758235-43da-44a7-b8cf-8b056770f095-kube-api-access-6pcvv" (OuterVolumeSpecName: "kube-api-access-6pcvv") pod "b3758235-43da-44a7-b8cf-8b056770f095" (UID: "b3758235-43da-44a7-b8cf-8b056770f095"). InnerVolumeSpecName "kube-api-access-6pcvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:23:25 crc kubenswrapper[4681]: I0122 09:23:25.720818 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pcvv\" (UniqueName: \"kubernetes.io/projected/b3758235-43da-44a7-b8cf-8b056770f095-kube-api-access-6pcvv\") on node \"crc\" DevicePath \"\"" Jan 22 09:23:26 crc kubenswrapper[4681]: I0122 09:23:26.031609 4681 generic.go:334] "Generic (PLEG): container finished" podID="b3758235-43da-44a7-b8cf-8b056770f095" containerID="69566da60df3752eb2c92f66b5b33f6b1312b35c84f34065f1953e6cc19de4d3" exitCode=0 Jan 22 09:23:26 crc kubenswrapper[4681]: I0122 09:23:26.031664 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-pss6b" event={"ID":"b3758235-43da-44a7-b8cf-8b056770f095","Type":"ContainerDied","Data":"69566da60df3752eb2c92f66b5b33f6b1312b35c84f34065f1953e6cc19de4d3"} Jan 22 09:23:26 crc kubenswrapper[4681]: I0122 09:23:26.031702 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-pss6b" event={"ID":"b3758235-43da-44a7-b8cf-8b056770f095","Type":"ContainerDied","Data":"02adfed5d6f73f69ee606d8e28a600c2f78cf6eac3161f96b75f8e398e635943"} Jan 22 09:23:26 crc kubenswrapper[4681]: I0122 09:23:26.031730 4681 scope.go:117] "RemoveContainer" containerID="69566da60df3752eb2c92f66b5b33f6b1312b35c84f34065f1953e6cc19de4d3" Jan 22 09:23:26 crc kubenswrapper[4681]: I0122 09:23:26.031867 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-pss6b" Jan 22 09:23:26 crc kubenswrapper[4681]: I0122 09:23:26.073271 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-pss6b"] Jan 22 09:23:26 crc kubenswrapper[4681]: I0122 09:23:26.078538 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-pss6b"] Jan 22 09:23:26 crc kubenswrapper[4681]: I0122 09:23:26.091082 4681 scope.go:117] "RemoveContainer" containerID="69566da60df3752eb2c92f66b5b33f6b1312b35c84f34065f1953e6cc19de4d3" Jan 22 09:23:26 crc kubenswrapper[4681]: E0122 09:23:26.091612 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69566da60df3752eb2c92f66b5b33f6b1312b35c84f34065f1953e6cc19de4d3\": container with ID starting with 69566da60df3752eb2c92f66b5b33f6b1312b35c84f34065f1953e6cc19de4d3 not found: ID does not exist" containerID="69566da60df3752eb2c92f66b5b33f6b1312b35c84f34065f1953e6cc19de4d3" Jan 22 09:23:26 crc kubenswrapper[4681]: I0122 09:23:26.091642 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69566da60df3752eb2c92f66b5b33f6b1312b35c84f34065f1953e6cc19de4d3"} err="failed to get container status \"69566da60df3752eb2c92f66b5b33f6b1312b35c84f34065f1953e6cc19de4d3\": rpc error: code = NotFound desc = could not find container \"69566da60df3752eb2c92f66b5b33f6b1312b35c84f34065f1953e6cc19de4d3\": container with ID starting with 69566da60df3752eb2c92f66b5b33f6b1312b35c84f34065f1953e6cc19de4d3 not found: ID does not exist" Jan 22 09:23:27 crc kubenswrapper[4681]: I0122 09:23:27.463028 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3758235-43da-44a7-b8cf-8b056770f095" path="/var/lib/kubelet/pods/b3758235-43da-44a7-b8cf-8b056770f095/volumes" Jan 22 09:24:26 crc 
kubenswrapper[4681]: I0122 09:24:26.031346 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:24:26 crc kubenswrapper[4681]: I0122 09:24:26.032175 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:24:56 crc kubenswrapper[4681]: I0122 09:24:56.039589 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:24:56 crc kubenswrapper[4681]: I0122 09:24:56.040796 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:25:26 crc kubenswrapper[4681]: I0122 09:25:26.032868 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:25:26 crc kubenswrapper[4681]: I0122 09:25:26.033437 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:25:26 crc kubenswrapper[4681]: I0122 09:25:26.033484 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:25:26 crc kubenswrapper[4681]: I0122 09:25:26.033965 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e28fe2c62bd4e388fc876768b83792c99b67b618087c90e219797afad38b74ea"} pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 09:25:26 crc kubenswrapper[4681]: I0122 09:25:26.034018 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" containerID="cri-o://e28fe2c62bd4e388fc876768b83792c99b67b618087c90e219797afad38b74ea" gracePeriod=600 Jan 22 09:25:28 crc kubenswrapper[4681]: I0122 09:25:28.211782 4681 generic.go:334] "Generic (PLEG): container finished" podID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerID="e28fe2c62bd4e388fc876768b83792c99b67b618087c90e219797afad38b74ea" exitCode=0 Jan 22 09:25:28 crc kubenswrapper[4681]: 
I0122 09:25:28.211889 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerDied","Data":"e28fe2c62bd4e388fc876768b83792c99b67b618087c90e219797afad38b74ea"} Jan 22 09:25:28 crc kubenswrapper[4681]: I0122 09:25:28.212483 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerStarted","Data":"1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e"} Jan 22 09:25:28 crc kubenswrapper[4681]: I0122 09:25:28.212525 4681 scope.go:117] "RemoveContainer" containerID="11b1c8cd564bf666acd3695c5c8a75cd8b968965a473b5788b1d1bc38ac5668b" Jan 22 09:26:41 crc kubenswrapper[4681]: I0122 09:26:41.659510 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fpkx2"] Jan 22 09:26:41 crc kubenswrapper[4681]: E0122 09:26:41.661002 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3758235-43da-44a7-b8cf-8b056770f095" containerName="registry-server" Jan 22 09:26:41 crc kubenswrapper[4681]: I0122 09:26:41.661020 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3758235-43da-44a7-b8cf-8b056770f095" containerName="registry-server" Jan 22 09:26:41 crc kubenswrapper[4681]: I0122 09:26:41.661187 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3758235-43da-44a7-b8cf-8b056770f095" containerName="registry-server" Jan 22 09:26:41 crc kubenswrapper[4681]: I0122 09:26:41.662223 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fpkx2" Jan 22 09:26:41 crc kubenswrapper[4681]: I0122 09:26:41.681880 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fpkx2"] Jan 22 09:26:41 crc kubenswrapper[4681]: I0122 09:26:41.773601 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec8e497-2f22-4108-8b5e-e6099ce1962f-catalog-content\") pod \"redhat-operators-fpkx2\" (UID: \"7ec8e497-2f22-4108-8b5e-e6099ce1962f\") " pod="openshift-marketplace/redhat-operators-fpkx2" Jan 22 09:26:41 crc kubenswrapper[4681]: I0122 09:26:41.774547 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xhb5\" (UniqueName: \"kubernetes.io/projected/7ec8e497-2f22-4108-8b5e-e6099ce1962f-kube-api-access-9xhb5\") pod \"redhat-operators-fpkx2\" (UID: \"7ec8e497-2f22-4108-8b5e-e6099ce1962f\") " pod="openshift-marketplace/redhat-operators-fpkx2" Jan 22 09:26:41 crc kubenswrapper[4681]: I0122 09:26:41.774680 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec8e497-2f22-4108-8b5e-e6099ce1962f-utilities\") pod \"redhat-operators-fpkx2\" (UID: \"7ec8e497-2f22-4108-8b5e-e6099ce1962f\") " pod="openshift-marketplace/redhat-operators-fpkx2" Jan 22 09:26:41 crc kubenswrapper[4681]: I0122 09:26:41.876028 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xhb5\" (UniqueName: \"kubernetes.io/projected/7ec8e497-2f22-4108-8b5e-e6099ce1962f-kube-api-access-9xhb5\") pod \"redhat-operators-fpkx2\" (UID: \"7ec8e497-2f22-4108-8b5e-e6099ce1962f\") " 
pod="openshift-marketplace/redhat-operators-fpkx2" Jan 22 09:26:41 crc kubenswrapper[4681]: I0122 09:26:41.876085 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec8e497-2f22-4108-8b5e-e6099ce1962f-utilities\") pod \"redhat-operators-fpkx2\" (UID: \"7ec8e497-2f22-4108-8b5e-e6099ce1962f\") " pod="openshift-marketplace/redhat-operators-fpkx2" Jan 22 09:26:41 crc kubenswrapper[4681]: I0122 09:26:41.876168 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec8e497-2f22-4108-8b5e-e6099ce1962f-catalog-content\") pod \"redhat-operators-fpkx2\" (UID: \"7ec8e497-2f22-4108-8b5e-e6099ce1962f\") " pod="openshift-marketplace/redhat-operators-fpkx2" Jan 22 09:26:41 crc kubenswrapper[4681]: I0122 09:26:41.876814 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec8e497-2f22-4108-8b5e-e6099ce1962f-utilities\") pod \"redhat-operators-fpkx2\" (UID: \"7ec8e497-2f22-4108-8b5e-e6099ce1962f\") " pod="openshift-marketplace/redhat-operators-fpkx2" Jan 22 09:26:41 crc kubenswrapper[4681]: I0122 09:26:41.876854 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec8e497-2f22-4108-8b5e-e6099ce1962f-catalog-content\") pod \"redhat-operators-fpkx2\" (UID: \"7ec8e497-2f22-4108-8b5e-e6099ce1962f\") " pod="openshift-marketplace/redhat-operators-fpkx2" Jan 22 09:26:41 crc kubenswrapper[4681]: I0122 09:26:41.897352 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xhb5\" (UniqueName: \"kubernetes.io/projected/7ec8e497-2f22-4108-8b5e-e6099ce1962f-kube-api-access-9xhb5\") pod \"redhat-operators-fpkx2\" (UID: \"7ec8e497-2f22-4108-8b5e-e6099ce1962f\") " pod="openshift-marketplace/redhat-operators-fpkx2" Jan 22 09:26:42 crc kubenswrapper[4681]: I0122 09:26:42.029588 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fpkx2" Jan 22 09:26:42 crc kubenswrapper[4681]: I0122 09:26:42.495235 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fpkx2"] Jan 22 09:26:42 crc kubenswrapper[4681]: I0122 09:26:42.937277 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpkx2" event={"ID":"7ec8e497-2f22-4108-8b5e-e6099ce1962f","Type":"ContainerStarted","Data":"df7728bed01846b90c94a7baec84ac3bbf4e504ecf583ca61a7418bc4fd780e0"} Jan 22 09:26:42 crc kubenswrapper[4681]: I0122 09:26:42.937325 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpkx2" event={"ID":"7ec8e497-2f22-4108-8b5e-e6099ce1962f","Type":"ContainerStarted","Data":"dba4ec7baa35f161e7923c646f0c2fcae9a179ba8ff2d3ae17f41838b7a8e26e"} Jan 22 09:26:43 crc kubenswrapper[4681]: I0122 09:26:43.950871 4681 generic.go:334] "Generic (PLEG): container finished" podID="7ec8e497-2f22-4108-8b5e-e6099ce1962f" containerID="df7728bed01846b90c94a7baec84ac3bbf4e504ecf583ca61a7418bc4fd780e0" exitCode=0 Jan 22 09:26:43 crc kubenswrapper[4681]: I0122 09:26:43.950973 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpkx2" event={"ID":"7ec8e497-2f22-4108-8b5e-e6099ce1962f","Type":"ContainerDied","Data":"df7728bed01846b90c94a7baec84ac3bbf4e504ecf583ca61a7418bc4fd780e0"} Jan 22 09:26:45 crc kubenswrapper[4681]: I0122 09:26:45.968749 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpkx2" event={"ID":"7ec8e497-2f22-4108-8b5e-e6099ce1962f","Type":"ContainerStarted","Data":"f63d1faab224b5825ea5dc680f0dbd3c21bb15e51234078bdd0ac4196f4dc407"} Jan 22 09:26:46 crc kubenswrapper[4681]: I0122 09:26:46.980173 4681 generic.go:334] "Generic (PLEG): container finished" podID="7ec8e497-2f22-4108-8b5e-e6099ce1962f" containerID="f63d1faab224b5825ea5dc680f0dbd3c21bb15e51234078bdd0ac4196f4dc407" exitCode=0 Jan 22 09:26:46 crc kubenswrapper[4681]: I0122 09:26:46.980239 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpkx2" event={"ID":"7ec8e497-2f22-4108-8b5e-e6099ce1962f","Type":"ContainerDied","Data":"f63d1faab224b5825ea5dc680f0dbd3c21bb15e51234078bdd0ac4196f4dc407"} Jan 22 09:26:47 crc kubenswrapper[4681]: I0122 09:26:47.990514 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpkx2" event={"ID":"7ec8e497-2f22-4108-8b5e-e6099ce1962f","Type":"ContainerStarted","Data":"085c66da8876865aa8563254184d62c390ed3d499a9d75bd96ad6ece391296dd"} Jan 22 09:26:48 crc kubenswrapper[4681]: I0122 09:26:48.015485 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fpkx2" podStartSLOduration=3.481757298 podStartE2EDuration="7.015458416s" podCreationTimestamp="2026-01-22 09:26:41 +0000 UTC" firstStartedPulling="2026-01-22 09:26:43.956410971 +0000 UTC m=+1394.782321526" lastFinishedPulling="2026-01-22 09:26:47.490112099 +0000 UTC m=+1398.316022644" observedRunningTime="2026-01-22 09:26:48.013501094 +0000 UTC m=+1398.839411609" watchObservedRunningTime="2026-01-22 09:26:48.015458416 +0000 UTC m=+1398.841368961" Jan 22 09:26:52 crc kubenswrapper[4681]: I0122 09:26:52.029995 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fpkx2" Jan 22 09:26:52 crc kubenswrapper[4681]: I0122 
09:26:52.030462 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fpkx2" Jan 22 09:26:53 crc kubenswrapper[4681]: I0122 09:26:53.072954 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fpkx2" podUID="7ec8e497-2f22-4108-8b5e-e6099ce1962f" containerName="registry-server" probeResult="failure" output=< Jan 22 09:26:53 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Jan 22 09:26:53 crc kubenswrapper[4681]: > Jan 22 09:27:02 crc kubenswrapper[4681]: I0122 09:27:02.096082 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fpkx2" Jan 22 09:27:02 crc kubenswrapper[4681]: I0122 09:27:02.171182 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fpkx2" Jan 22 09:27:02 crc kubenswrapper[4681]: I0122 09:27:02.340653 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fpkx2"] Jan 22 09:27:03 crc kubenswrapper[4681]: I0122 09:27:03.114678 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fpkx2" podUID="7ec8e497-2f22-4108-8b5e-e6099ce1962f" containerName="registry-server" containerID="cri-o://085c66da8876865aa8563254184d62c390ed3d499a9d75bd96ad6ece391296dd" gracePeriod=2 Jan 22 09:27:06 crc kubenswrapper[4681]: I0122 09:27:06.139824 4681 generic.go:334] "Generic (PLEG): container finished" podID="7ec8e497-2f22-4108-8b5e-e6099ce1962f" containerID="085c66da8876865aa8563254184d62c390ed3d499a9d75bd96ad6ece391296dd" exitCode=0 Jan 22 09:27:06 crc kubenswrapper[4681]: I0122 09:27:06.140193 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpkx2" event={"ID":"7ec8e497-2f22-4108-8b5e-e6099ce1962f","Type":"ContainerDied","Data":"085c66da8876865aa8563254184d62c390ed3d499a9d75bd96ad6ece391296dd"}
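Two of the preceding entries reward a closer look. In the "Observed pod startup duration" record for redhat-operators-fpkx2 at 09:26:48, podStartE2EDuration="7.015458416s" is exactly watchObservedRunningTime (09:26:48.015458416) minus podCreationTimestamp (09:26:41 +0000 UTC), and podStartSLOduration=3.481757298 equals that same figure minus the image-pull window recorded in the monotonic offsets (m=+1398.316022644 - m=+1394.782321526 = 3.533701118 s), which suggests the latency tracker excludes pull time from the SLO-relevant duration.

The startup-probe failure at 09:26:53 (timeout: failed to connect service ":50051" within 1s) has the wording of a grpc_health_probe-style exec check against the registry-server gRPC port, and it clears at 09:27:02 once the catalog is being served. As a rough, hedged illustration of only the connect-with-timeout part of such a check (the "localhost" host, the bare TCP dial instead of a real gRPC health RPC, and the file name are assumptions of this sketch, not the pod's actual probe), a minimal Go version could look like:

    // connectprobe.go - illustrative sketch only: try to reach the port named in the
    // probe output within the same 1 s budget and report failure in a similar style.
    // It does a plain TCP dial; the real probe additionally speaks the gRPC health protocol.
    package main

    import (
        "fmt"
        "net"
        "os"
        "time"
    )

    func main() {
        addr := "localhost:50051"  // port taken from the probe output ":50051"; host is assumed
        timeout := 1 * time.Second // budget taken from "within 1s"

        conn, err := net.DialTimeout("tcp", addr, timeout)
        if err != nil {
            // Mirrors the failure text kubelet recorded under probeResult="failure";
            // a production probe would distinguish timeouts from refusals and check gRPC health.
            fmt.Fprintf(os.Stderr, "timeout: failed to connect service %q within %v\n", addr, timeout)
            os.Exit(1)
        }
        conn.Close()
        fmt.Println("connection established; the registry-server port is accepting clients")
    }

Run while nothing is listening on the port, this exits non-zero with a message in the same style as the probeResult="failure" output above; once registry-server is up it exits zero, matching the probe="startup" status="started" transition at 09:27:02.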
Jan 22 09:27:06 crc kubenswrapper[4681]: I0122 09:27:06.213310 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fpkx2" Jan 22 09:27:06 crc kubenswrapper[4681]: I0122 09:27:06.379365 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xhb5\" (UniqueName: \"kubernetes.io/projected/7ec8e497-2f22-4108-8b5e-e6099ce1962f-kube-api-access-9xhb5\") pod \"7ec8e497-2f22-4108-8b5e-e6099ce1962f\" (UID: \"7ec8e497-2f22-4108-8b5e-e6099ce1962f\") " Jan 22 09:27:06 crc kubenswrapper[4681]: I0122 09:27:06.379585 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec8e497-2f22-4108-8b5e-e6099ce1962f-utilities\") pod \"7ec8e497-2f22-4108-8b5e-e6099ce1962f\" (UID: \"7ec8e497-2f22-4108-8b5e-e6099ce1962f\") " Jan 22 09:27:06 crc kubenswrapper[4681]: I0122 09:27:06.379740 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec8e497-2f22-4108-8b5e-e6099ce1962f-catalog-content\") pod \"7ec8e497-2f22-4108-8b5e-e6099ce1962f\" (UID: \"7ec8e497-2f22-4108-8b5e-e6099ce1962f\") " Jan 22 09:27:06 crc kubenswrapper[4681]: I0122 09:27:06.380578 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ec8e497-2f22-4108-8b5e-e6099ce1962f-utilities" (OuterVolumeSpecName: "utilities") pod "7ec8e497-2f22-4108-8b5e-e6099ce1962f" (UID: "7ec8e497-2f22-4108-8b5e-e6099ce1962f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:27:06 crc kubenswrapper[4681]: I0122 09:27:06.388669 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ec8e497-2f22-4108-8b5e-e6099ce1962f-kube-api-access-9xhb5" (OuterVolumeSpecName: "kube-api-access-9xhb5") pod "7ec8e497-2f22-4108-8b5e-e6099ce1962f" (UID: "7ec8e497-2f22-4108-8b5e-e6099ce1962f"). InnerVolumeSpecName "kube-api-access-9xhb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:27:06 crc kubenswrapper[4681]: I0122 09:27:06.482457 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xhb5\" (UniqueName: \"kubernetes.io/projected/7ec8e497-2f22-4108-8b5e-e6099ce1962f-kube-api-access-9xhb5\") on node \"crc\" DevicePath \"\"" Jan 22 09:27:06 crc kubenswrapper[4681]: I0122 09:27:06.482510 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ec8e497-2f22-4108-8b5e-e6099ce1962f-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:27:06 crc kubenswrapper[4681]: I0122 09:27:06.510430 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ec8e497-2f22-4108-8b5e-e6099ce1962f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ec8e497-2f22-4108-8b5e-e6099ce1962f" (UID: "7ec8e497-2f22-4108-8b5e-e6099ce1962f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:27:06 crc kubenswrapper[4681]: I0122 09:27:06.585200 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ec8e497-2f22-4108-8b5e-e6099ce1962f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:27:07 crc kubenswrapper[4681]: I0122 09:27:07.157448 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpkx2" event={"ID":"7ec8e497-2f22-4108-8b5e-e6099ce1962f","Type":"ContainerDied","Data":"dba4ec7baa35f161e7923c646f0c2fcae9a179ba8ff2d3ae17f41838b7a8e26e"} Jan 22 09:27:07 crc kubenswrapper[4681]: I0122 09:27:07.157507 4681 scope.go:117] "RemoveContainer" containerID="085c66da8876865aa8563254184d62c390ed3d499a9d75bd96ad6ece391296dd" Jan 22 09:27:07 crc kubenswrapper[4681]: I0122 09:27:07.157552 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fpkx2" Jan 22 09:27:07 crc kubenswrapper[4681]: I0122 09:27:07.179233 4681 scope.go:117] "RemoveContainer" containerID="f63d1faab224b5825ea5dc680f0dbd3c21bb15e51234078bdd0ac4196f4dc407" Jan 22 09:27:07 crc kubenswrapper[4681]: I0122 09:27:07.211941 4681 scope.go:117] "RemoveContainer" containerID="df7728bed01846b90c94a7baec84ac3bbf4e504ecf583ca61a7418bc4fd780e0" Jan 22 09:27:07 crc kubenswrapper[4681]: I0122 09:27:07.239902 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fpkx2"] Jan 22 09:27:07 crc kubenswrapper[4681]: I0122 09:27:07.245628 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fpkx2"] Jan 22 09:27:07 crc kubenswrapper[4681]: I0122 09:27:07.469652 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ec8e497-2f22-4108-8b5e-e6099ce1962f" path="/var/lib/kubelet/pods/7ec8e497-2f22-4108-8b5e-e6099ce1962f/volumes" Jan 22 09:27:56 crc kubenswrapper[4681]: I0122 09:27:56.031360 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:27:56 crc kubenswrapper[4681]: I0122 09:27:56.032030 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:28:26 crc kubenswrapper[4681]: I0122 09:28:26.031547 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:28:26 crc kubenswrapper[4681]: I0122 09:28:26.032138 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:28:45 crc kubenswrapper[4681]: I0122 09:28:45.786921 4681 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-operators-zckzc"] Jan 22 09:28:45 crc kubenswrapper[4681]: E0122 09:28:45.790288 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec8e497-2f22-4108-8b5e-e6099ce1962f" containerName="registry-server" Jan 22 09:28:45 crc kubenswrapper[4681]: I0122 09:28:45.790312 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec8e497-2f22-4108-8b5e-e6099ce1962f" containerName="registry-server" Jan 22 09:28:45 crc kubenswrapper[4681]: E0122 09:28:45.790351 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec8e497-2f22-4108-8b5e-e6099ce1962f" containerName="extract-content" Jan 22 09:28:45 crc kubenswrapper[4681]: I0122 09:28:45.790363 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec8e497-2f22-4108-8b5e-e6099ce1962f" containerName="extract-content" Jan 22 09:28:45 crc kubenswrapper[4681]: E0122 09:28:45.790380 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec8e497-2f22-4108-8b5e-e6099ce1962f" containerName="extract-utilities" Jan 22 09:28:45 crc kubenswrapper[4681]: I0122 09:28:45.790394 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec8e497-2f22-4108-8b5e-e6099ce1962f" containerName="extract-utilities" Jan 22 09:28:45 crc kubenswrapper[4681]: I0122 09:28:45.790605 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ec8e497-2f22-4108-8b5e-e6099ce1962f" containerName="registry-server" Jan 22 09:28:45 crc kubenswrapper[4681]: I0122 09:28:45.791409 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-zckzc" Jan 22 09:28:45 crc kubenswrapper[4681]: I0122 09:28:45.799722 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-zckzc"] Jan 22 09:28:45 crc kubenswrapper[4681]: I0122 09:28:45.850647 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sclk9\" (UniqueName: \"kubernetes.io/projected/b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8-kube-api-access-sclk9\") pod \"service-telemetry-framework-operators-zckzc\" (UID: \"b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8\") " pod="service-telemetry/service-telemetry-framework-operators-zckzc" Jan 22 09:28:45 crc kubenswrapper[4681]: I0122 09:28:45.952843 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sclk9\" (UniqueName: \"kubernetes.io/projected/b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8-kube-api-access-sclk9\") pod \"service-telemetry-framework-operators-zckzc\" (UID: \"b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8\") " pod="service-telemetry/service-telemetry-framework-operators-zckzc" Jan 22 09:28:45 crc kubenswrapper[4681]: I0122 09:28:45.982575 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sclk9\" (UniqueName: \"kubernetes.io/projected/b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8-kube-api-access-sclk9\") pod \"service-telemetry-framework-operators-zckzc\" (UID: \"b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8\") " pod="service-telemetry/service-telemetry-framework-operators-zckzc" Jan 22 09:28:46 crc kubenswrapper[4681]: I0122 09:28:46.119129 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-zckzc" Jan 22 09:28:46 crc kubenswrapper[4681]: I0122 09:28:46.374142 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-zckzc"] Jan 22 09:28:46 crc kubenswrapper[4681]: W0122 09:28:46.375712 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb52da0f2_768a_4f0c_8c2c_c06ffbb3a1b8.slice/crio-763134e2a8ae74996a7d9b91a6f3a9f1dc1d10483b035db4a87ad8c4e9ab3b55 WatchSource:0}: Error finding container 763134e2a8ae74996a7d9b91a6f3a9f1dc1d10483b035db4a87ad8c4e9ab3b55: Status 404 returned error can't find the container with id 763134e2a8ae74996a7d9b91a6f3a9f1dc1d10483b035db4a87ad8c4e9ab3b55 Jan 22 09:28:46 crc kubenswrapper[4681]: I0122 09:28:46.379429 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 09:28:47 crc kubenswrapper[4681]: I0122 09:28:47.011633 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-zckzc" event={"ID":"b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8","Type":"ContainerStarted","Data":"67db57191b282eedcc4a0361d913fbfacccb84a524b5aaf8d06a607804096168"} Jan 22 09:28:47 crc kubenswrapper[4681]: I0122 09:28:47.012096 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-zckzc" event={"ID":"b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8","Type":"ContainerStarted","Data":"763134e2a8ae74996a7d9b91a6f3a9f1dc1d10483b035db4a87ad8c4e9ab3b55"} Jan 22 09:28:47 crc kubenswrapper[4681]: I0122 09:28:47.070900 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-operators-zckzc" podStartSLOduration=1.934146831 podStartE2EDuration="2.070866293s" podCreationTimestamp="2026-01-22 09:28:45 +0000 UTC" firstStartedPulling="2026-01-22 09:28:46.378727478 +0000 UTC m=+1517.204638013" lastFinishedPulling="2026-01-22 09:28:46.51544696 +0000 UTC m=+1517.341357475" observedRunningTime="2026-01-22 09:28:47.062032028 +0000 UTC m=+1517.887942553" watchObservedRunningTime="2026-01-22 09:28:47.070866293 +0000 UTC m=+1517.896776808" Jan 22 09:28:56 crc kubenswrapper[4681]: I0122 09:28:56.031663 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:28:56 crc kubenswrapper[4681]: I0122 09:28:56.032314 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:28:56 crc kubenswrapper[4681]: I0122 09:28:56.032381 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:28:56 crc kubenswrapper[4681]: I0122 09:28:56.033296 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e"} 
pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 09:28:56 crc kubenswrapper[4681]: I0122 09:28:56.033389 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" containerID="cri-o://1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" gracePeriod=600 Jan 22 09:28:56 crc kubenswrapper[4681]: I0122 09:28:56.119609 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/service-telemetry-framework-operators-zckzc" Jan 22 09:28:56 crc kubenswrapper[4681]: I0122 09:28:56.120627 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/service-telemetry-framework-operators-zckzc" Jan 22 09:28:56 crc kubenswrapper[4681]: I0122 09:28:56.161320 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/service-telemetry-framework-operators-zckzc" Jan 22 09:28:56 crc kubenswrapper[4681]: E0122 09:28:56.666921 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:28:57 crc kubenswrapper[4681]: I0122 09:28:57.119225 4681 generic.go:334] "Generic (PLEG): container finished" podID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" exitCode=0 Jan 22 09:28:57 crc kubenswrapper[4681]: I0122 09:28:57.119448 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerDied","Data":"1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e"} Jan 22 09:28:57 crc kubenswrapper[4681]: I0122 09:28:57.120833 4681 scope.go:117] "RemoveContainer" containerID="e28fe2c62bd4e388fc876768b83792c99b67b618087c90e219797afad38b74ea" Jan 22 09:28:57 crc kubenswrapper[4681]: I0122 09:28:57.121707 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:28:57 crc kubenswrapper[4681]: E0122 09:28:57.122021 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:28:57 crc kubenswrapper[4681]: I0122 09:28:57.186050 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/service-telemetry-framework-operators-zckzc" Jan 22 09:28:57 crc kubenswrapper[4681]: I0122 09:28:57.241867 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-zckzc"] Jan 22 09:28:59 crc kubenswrapper[4681]: I0122 09:28:59.142139 
4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-framework-operators-zckzc" podUID="b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8" containerName="registry-server" containerID="cri-o://67db57191b282eedcc4a0361d913fbfacccb84a524b5aaf8d06a607804096168" gracePeriod=2 Jan 22 09:28:59 crc kubenswrapper[4681]: I0122 09:28:59.584297 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-zckzc" Jan 22 09:28:59 crc kubenswrapper[4681]: I0122 09:28:59.706649 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sclk9\" (UniqueName: \"kubernetes.io/projected/b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8-kube-api-access-sclk9\") pod \"b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8\" (UID: \"b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8\") " Jan 22 09:28:59 crc kubenswrapper[4681]: I0122 09:28:59.715823 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8-kube-api-access-sclk9" (OuterVolumeSpecName: "kube-api-access-sclk9") pod "b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8" (UID: "b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8"). InnerVolumeSpecName "kube-api-access-sclk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:28:59 crc kubenswrapper[4681]: I0122 09:28:59.808298 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sclk9\" (UniqueName: \"kubernetes.io/projected/b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8-kube-api-access-sclk9\") on node \"crc\" DevicePath \"\"" Jan 22 09:29:00 crc kubenswrapper[4681]: I0122 09:29:00.152661 4681 generic.go:334] "Generic (PLEG): container finished" podID="b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8" containerID="67db57191b282eedcc4a0361d913fbfacccb84a524b5aaf8d06a607804096168" exitCode=0 Jan 22 09:29:00 crc kubenswrapper[4681]: I0122 09:29:00.152746 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-zckzc" Jan 22 09:29:00 crc kubenswrapper[4681]: I0122 09:29:00.152745 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-zckzc" event={"ID":"b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8","Type":"ContainerDied","Data":"67db57191b282eedcc4a0361d913fbfacccb84a524b5aaf8d06a607804096168"} Jan 22 09:29:00 crc kubenswrapper[4681]: I0122 09:29:00.152861 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-zckzc" event={"ID":"b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8","Type":"ContainerDied","Data":"763134e2a8ae74996a7d9b91a6f3a9f1dc1d10483b035db4a87ad8c4e9ab3b55"} Jan 22 09:29:00 crc kubenswrapper[4681]: I0122 09:29:00.152995 4681 scope.go:117] "RemoveContainer" containerID="67db57191b282eedcc4a0361d913fbfacccb84a524b5aaf8d06a607804096168" Jan 22 09:29:00 crc kubenswrapper[4681]: I0122 09:29:00.184072 4681 scope.go:117] "RemoveContainer" containerID="67db57191b282eedcc4a0361d913fbfacccb84a524b5aaf8d06a607804096168" Jan 22 09:29:00 crc kubenswrapper[4681]: E0122 09:29:00.184962 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67db57191b282eedcc4a0361d913fbfacccb84a524b5aaf8d06a607804096168\": container with ID starting with 67db57191b282eedcc4a0361d913fbfacccb84a524b5aaf8d06a607804096168 not found: ID does not exist" containerID="67db57191b282eedcc4a0361d913fbfacccb84a524b5aaf8d06a607804096168" Jan 22 09:29:00 crc kubenswrapper[4681]: I0122 09:29:00.185013 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67db57191b282eedcc4a0361d913fbfacccb84a524b5aaf8d06a607804096168"} err="failed to get container status \"67db57191b282eedcc4a0361d913fbfacccb84a524b5aaf8d06a607804096168\": rpc error: code = NotFound desc = could not find container \"67db57191b282eedcc4a0361d913fbfacccb84a524b5aaf8d06a607804096168\": container with ID starting with 67db57191b282eedcc4a0361d913fbfacccb84a524b5aaf8d06a607804096168 not found: ID does not exist" Jan 22 09:29:00 crc kubenswrapper[4681]: I0122 09:29:00.211152 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-zckzc"] Jan 22 09:29:00 crc kubenswrapper[4681]: I0122 09:29:00.227217 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-zckzc"] Jan 22 09:29:01 crc kubenswrapper[4681]: I0122 09:29:01.465579 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8" path="/var/lib/kubelet/pods/b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8/volumes" Jan 22 09:29:12 crc kubenswrapper[4681]: I0122 09:29:12.453432 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:29:12 crc kubenswrapper[4681]: E0122 09:29:12.454399 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:29:25 crc kubenswrapper[4681]: I0122 09:29:25.453257 4681 
scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:29:25 crc kubenswrapper[4681]: E0122 09:29:25.453857 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:29:36 crc kubenswrapper[4681]: I0122 09:29:36.453046 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:29:36 crc kubenswrapper[4681]: E0122 09:29:36.454373 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:29:47 crc kubenswrapper[4681]: I0122 09:29:47.453250 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:29:47 crc kubenswrapper[4681]: E0122 09:29:47.454201 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.151716 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r"] Jan 22 09:30:00 crc kubenswrapper[4681]: E0122 09:30:00.152474 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8" containerName="registry-server" Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.152495 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8" containerName="registry-server" Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.152701 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="b52da0f2-768a-4f0c-8c2c-c06ffbb3a1b8" containerName="registry-server" Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.153435 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r" Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.159740 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r"] Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.160221 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.162386 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.190369 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7432284c-1fcf-4350-b488-a63f3737d3d7-config-volume\") pod \"collect-profiles-29484570-mpf7r\" (UID: \"7432284c-1fcf-4350-b488-a63f3737d3d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r" Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.190456 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7432284c-1fcf-4350-b488-a63f3737d3d7-secret-volume\") pod \"collect-profiles-29484570-mpf7r\" (UID: \"7432284c-1fcf-4350-b488-a63f3737d3d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r" Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.190712 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct5t8\" (UniqueName: \"kubernetes.io/projected/7432284c-1fcf-4350-b488-a63f3737d3d7-kube-api-access-ct5t8\") pod \"collect-profiles-29484570-mpf7r\" (UID: \"7432284c-1fcf-4350-b488-a63f3737d3d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r" Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.291916 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct5t8\" (UniqueName: \"kubernetes.io/projected/7432284c-1fcf-4350-b488-a63f3737d3d7-kube-api-access-ct5t8\") pod \"collect-profiles-29484570-mpf7r\" (UID: \"7432284c-1fcf-4350-b488-a63f3737d3d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r" Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.292057 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7432284c-1fcf-4350-b488-a63f3737d3d7-config-volume\") pod \"collect-profiles-29484570-mpf7r\" (UID: \"7432284c-1fcf-4350-b488-a63f3737d3d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r" Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.292100 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7432284c-1fcf-4350-b488-a63f3737d3d7-secret-volume\") pod \"collect-profiles-29484570-mpf7r\" (UID: \"7432284c-1fcf-4350-b488-a63f3737d3d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r" Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.293128 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7432284c-1fcf-4350-b488-a63f3737d3d7-config-volume\") pod 
\"collect-profiles-29484570-mpf7r\" (UID: \"7432284c-1fcf-4350-b488-a63f3737d3d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r" Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.312466 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7432284c-1fcf-4350-b488-a63f3737d3d7-secret-volume\") pod \"collect-profiles-29484570-mpf7r\" (UID: \"7432284c-1fcf-4350-b488-a63f3737d3d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r" Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.321674 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct5t8\" (UniqueName: \"kubernetes.io/projected/7432284c-1fcf-4350-b488-a63f3737d3d7-kube-api-access-ct5t8\") pod \"collect-profiles-29484570-mpf7r\" (UID: \"7432284c-1fcf-4350-b488-a63f3737d3d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r" Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.473790 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r" Jan 22 09:30:00 crc kubenswrapper[4681]: I0122 09:30:00.710742 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r"] Jan 22 09:30:00 crc kubenswrapper[4681]: W0122 09:30:00.741430 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7432284c_1fcf_4350_b488_a63f3737d3d7.slice/crio-029aa1e5d040e51c7c62c191e89de8f60a2569f1bccc366278ad9ab61cde3622 WatchSource:0}: Error finding container 029aa1e5d040e51c7c62c191e89de8f60a2569f1bccc366278ad9ab61cde3622: Status 404 returned error can't find the container with id 029aa1e5d040e51c7c62c191e89de8f60a2569f1bccc366278ad9ab61cde3622 Jan 22 09:30:01 crc kubenswrapper[4681]: I0122 09:30:01.455173 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:30:01 crc kubenswrapper[4681]: E0122 09:30:01.455449 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:30:01 crc kubenswrapper[4681]: I0122 09:30:01.708842 4681 generic.go:334] "Generic (PLEG): container finished" podID="7432284c-1fcf-4350-b488-a63f3737d3d7" containerID="ce15e8c0026611efdef8f53d389099a25b5c6f7542484185fc0aafb2b4c84b45" exitCode=0 Jan 22 09:30:01 crc kubenswrapper[4681]: I0122 09:30:01.708897 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r" event={"ID":"7432284c-1fcf-4350-b488-a63f3737d3d7","Type":"ContainerDied","Data":"ce15e8c0026611efdef8f53d389099a25b5c6f7542484185fc0aafb2b4c84b45"} Jan 22 09:30:01 crc kubenswrapper[4681]: I0122 09:30:01.708928 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r" 
event={"ID":"7432284c-1fcf-4350-b488-a63f3737d3d7","Type":"ContainerStarted","Data":"029aa1e5d040e51c7c62c191e89de8f60a2569f1bccc366278ad9ab61cde3622"} Jan 22 09:30:03 crc kubenswrapper[4681]: I0122 09:30:03.076556 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r" Jan 22 09:30:03 crc kubenswrapper[4681]: I0122 09:30:03.161993 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct5t8\" (UniqueName: \"kubernetes.io/projected/7432284c-1fcf-4350-b488-a63f3737d3d7-kube-api-access-ct5t8\") pod \"7432284c-1fcf-4350-b488-a63f3737d3d7\" (UID: \"7432284c-1fcf-4350-b488-a63f3737d3d7\") " Jan 22 09:30:03 crc kubenswrapper[4681]: I0122 09:30:03.162235 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7432284c-1fcf-4350-b488-a63f3737d3d7-config-volume\") pod \"7432284c-1fcf-4350-b488-a63f3737d3d7\" (UID: \"7432284c-1fcf-4350-b488-a63f3737d3d7\") " Jan 22 09:30:03 crc kubenswrapper[4681]: I0122 09:30:03.162449 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7432284c-1fcf-4350-b488-a63f3737d3d7-secret-volume\") pod \"7432284c-1fcf-4350-b488-a63f3737d3d7\" (UID: \"7432284c-1fcf-4350-b488-a63f3737d3d7\") " Jan 22 09:30:03 crc kubenswrapper[4681]: I0122 09:30:03.163439 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7432284c-1fcf-4350-b488-a63f3737d3d7-config-volume" (OuterVolumeSpecName: "config-volume") pod "7432284c-1fcf-4350-b488-a63f3737d3d7" (UID: "7432284c-1fcf-4350-b488-a63f3737d3d7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:30:03 crc kubenswrapper[4681]: I0122 09:30:03.168732 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7432284c-1fcf-4350-b488-a63f3737d3d7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7432284c-1fcf-4350-b488-a63f3737d3d7" (UID: "7432284c-1fcf-4350-b488-a63f3737d3d7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:30:03 crc kubenswrapper[4681]: I0122 09:30:03.169146 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7432284c-1fcf-4350-b488-a63f3737d3d7-kube-api-access-ct5t8" (OuterVolumeSpecName: "kube-api-access-ct5t8") pod "7432284c-1fcf-4350-b488-a63f3737d3d7" (UID: "7432284c-1fcf-4350-b488-a63f3737d3d7"). InnerVolumeSpecName "kube-api-access-ct5t8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:30:03 crc kubenswrapper[4681]: I0122 09:30:03.264519 4681 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7432284c-1fcf-4350-b488-a63f3737d3d7-config-volume\") on node \"crc\" DevicePath \"\"" Jan 22 09:30:03 crc kubenswrapper[4681]: I0122 09:30:03.264851 4681 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7432284c-1fcf-4350-b488-a63f3737d3d7-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 22 09:30:03 crc kubenswrapper[4681]: I0122 09:30:03.264941 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct5t8\" (UniqueName: \"kubernetes.io/projected/7432284c-1fcf-4350-b488-a63f3737d3d7-kube-api-access-ct5t8\") on node \"crc\" DevicePath \"\"" Jan 22 09:30:03 crc kubenswrapper[4681]: I0122 09:30:03.725211 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r" event={"ID":"7432284c-1fcf-4350-b488-a63f3737d3d7","Type":"ContainerDied","Data":"029aa1e5d040e51c7c62c191e89de8f60a2569f1bccc366278ad9ab61cde3622"} Jan 22 09:30:03 crc kubenswrapper[4681]: I0122 09:30:03.725638 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="029aa1e5d040e51c7c62c191e89de8f60a2569f1bccc366278ad9ab61cde3622" Jan 22 09:30:03 crc kubenswrapper[4681]: I0122 09:30:03.725278 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484570-mpf7r" Jan 22 09:30:16 crc kubenswrapper[4681]: I0122 09:30:16.452904 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:30:16 crc kubenswrapper[4681]: E0122 09:30:16.453731 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:30:31 crc kubenswrapper[4681]: I0122 09:30:31.453569 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:30:31 crc kubenswrapper[4681]: E0122 09:30:31.454633 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:30:46 crc kubenswrapper[4681]: I0122 09:30:46.453045 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:30:46 crc kubenswrapper[4681]: E0122 09:30:46.454322 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:30:58 crc kubenswrapper[4681]: I0122 09:30:58.453535 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:30:58 crc kubenswrapper[4681]: E0122 09:30:58.454700 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:31:10 crc kubenswrapper[4681]: I0122 09:31:10.452160 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:31:10 crc kubenswrapper[4681]: E0122 09:31:10.454044 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:31:22 crc kubenswrapper[4681]: I0122 09:31:22.453545 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:31:22 crc kubenswrapper[4681]: E0122 09:31:22.454285 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:31:35 crc kubenswrapper[4681]: I0122 09:31:35.453088 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:31:35 crc kubenswrapper[4681]: E0122 09:31:35.453947 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:31:47 crc kubenswrapper[4681]: I0122 09:31:47.453868 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:31:47 crc kubenswrapper[4681]: E0122 09:31:47.455084 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:32:02 crc kubenswrapper[4681]: I0122 09:32:02.452564 4681 
scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:32:02 crc kubenswrapper[4681]: E0122 09:32:02.453513 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:32:15 crc kubenswrapper[4681]: I0122 09:32:15.452720 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:32:15 crc kubenswrapper[4681]: E0122 09:32:15.453767 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:32:30 crc kubenswrapper[4681]: I0122 09:32:30.452665 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:32:30 crc kubenswrapper[4681]: E0122 09:32:30.453588 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:32:43 crc kubenswrapper[4681]: I0122 09:32:43.453354 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:32:43 crc kubenswrapper[4681]: E0122 09:32:43.454222 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:32:54 crc kubenswrapper[4681]: I0122 09:32:54.455044 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:32:54 crc kubenswrapper[4681]: E0122 09:32:54.456124 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:33:07 crc kubenswrapper[4681]: I0122 09:33:07.453027 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:33:07 crc kubenswrapper[4681]: E0122 09:33:07.453988 4681 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:33:19 crc kubenswrapper[4681]: I0122 09:33:19.461361 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:33:19 crc kubenswrapper[4681]: E0122 09:33:19.464147 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:33:31 crc kubenswrapper[4681]: I0122 09:33:31.453465 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:33:31 crc kubenswrapper[4681]: E0122 09:33:31.454552 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:33:42 crc kubenswrapper[4681]: I0122 09:33:42.453323 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:33:42 crc kubenswrapper[4681]: E0122 09:33:42.454211 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:33:57 crc kubenswrapper[4681]: I0122 09:33:57.454557 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:33:57 crc kubenswrapper[4681]: I0122 09:33:57.800946 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerStarted","Data":"e1c94f07f44d5de719a64c7ab20f849f2e3e6362d4e3f454de26b2a29c719d72"} Jan 22 09:34:41 crc kubenswrapper[4681]: I0122 09:34:41.108711 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-operators-ws6xn"] Jan 22 09:34:41 crc kubenswrapper[4681]: E0122 09:34:41.109483 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7432284c-1fcf-4350-b488-a63f3737d3d7" containerName="collect-profiles" Jan 22 09:34:41 crc kubenswrapper[4681]: I0122 09:34:41.109496 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="7432284c-1fcf-4350-b488-a63f3737d3d7" containerName="collect-profiles" Jan 22 
09:34:41 crc kubenswrapper[4681]: I0122 09:34:41.109643 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="7432284c-1fcf-4350-b488-a63f3737d3d7" containerName="collect-profiles" Jan 22 09:34:41 crc kubenswrapper[4681]: I0122 09:34:41.110045 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-ws6xn" Jan 22 09:34:41 crc kubenswrapper[4681]: I0122 09:34:41.122322 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-ws6xn"] Jan 22 09:34:41 crc kubenswrapper[4681]: I0122 09:34:41.251160 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsz97\" (UniqueName: \"kubernetes.io/projected/38c5661d-35c9-4a26-a697-1797743e610f-kube-api-access-nsz97\") pod \"service-telemetry-framework-operators-ws6xn\" (UID: \"38c5661d-35c9-4a26-a697-1797743e610f\") " pod="service-telemetry/service-telemetry-framework-operators-ws6xn" Jan 22 09:34:41 crc kubenswrapper[4681]: I0122 09:34:41.352907 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsz97\" (UniqueName: \"kubernetes.io/projected/38c5661d-35c9-4a26-a697-1797743e610f-kube-api-access-nsz97\") pod \"service-telemetry-framework-operators-ws6xn\" (UID: \"38c5661d-35c9-4a26-a697-1797743e610f\") " pod="service-telemetry/service-telemetry-framework-operators-ws6xn" Jan 22 09:34:41 crc kubenswrapper[4681]: I0122 09:34:41.372760 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsz97\" (UniqueName: \"kubernetes.io/projected/38c5661d-35c9-4a26-a697-1797743e610f-kube-api-access-nsz97\") pod \"service-telemetry-framework-operators-ws6xn\" (UID: \"38c5661d-35c9-4a26-a697-1797743e610f\") " pod="service-telemetry/service-telemetry-framework-operators-ws6xn" Jan 22 09:34:41 crc kubenswrapper[4681]: I0122 09:34:41.466802 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-ws6xn" Jan 22 09:34:41 crc kubenswrapper[4681]: I0122 09:34:41.745325 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-ws6xn"] Jan 22 09:34:41 crc kubenswrapper[4681]: I0122 09:34:41.766343 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 09:34:42 crc kubenswrapper[4681]: I0122 09:34:42.184966 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-ws6xn" event={"ID":"38c5661d-35c9-4a26-a697-1797743e610f","Type":"ContainerStarted","Data":"dd649dadb8edc8d5251dbedc9f7b8ec3c37d47d6a8a140d158ce69520a256805"} Jan 22 09:34:42 crc kubenswrapper[4681]: I0122 09:34:42.185465 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-ws6xn" event={"ID":"38c5661d-35c9-4a26-a697-1797743e610f","Type":"ContainerStarted","Data":"25ae5314255ead370cdf4a924279908a9af98aaa8e85d5bf0bff2a3eb5311482"} Jan 22 09:34:42 crc kubenswrapper[4681]: I0122 09:34:42.235377 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-operators-ws6xn" podStartSLOduration=1.122601702 podStartE2EDuration="1.235239649s" podCreationTimestamp="2026-01-22 09:34:41 +0000 UTC" firstStartedPulling="2026-01-22 09:34:41.76611747 +0000 UTC m=+1872.592027965" lastFinishedPulling="2026-01-22 09:34:41.878755387 +0000 UTC m=+1872.704665912" observedRunningTime="2026-01-22 09:34:42.227013042 +0000 UTC m=+1873.052923567" watchObservedRunningTime="2026-01-22 09:34:42.235239649 +0000 UTC m=+1873.061150204" Jan 22 09:34:51 crc kubenswrapper[4681]: I0122 09:34:51.471378 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/service-telemetry-framework-operators-ws6xn" Jan 22 09:34:51 crc kubenswrapper[4681]: I0122 09:34:51.471799 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/service-telemetry-framework-operators-ws6xn" Jan 22 09:34:51 crc kubenswrapper[4681]: I0122 09:34:51.521172 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/service-telemetry-framework-operators-ws6xn" Jan 22 09:34:52 crc kubenswrapper[4681]: I0122 09:34:52.314914 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/service-telemetry-framework-operators-ws6xn" Jan 22 09:34:52 crc kubenswrapper[4681]: I0122 09:34:52.362895 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-ws6xn"] Jan 22 09:34:54 crc kubenswrapper[4681]: I0122 09:34:54.276957 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-framework-operators-ws6xn" podUID="38c5661d-35c9-4a26-a697-1797743e610f" containerName="registry-server" containerID="cri-o://dd649dadb8edc8d5251dbedc9f7b8ec3c37d47d6a8a140d158ce69520a256805" gracePeriod=2 Jan 22 09:34:55 crc kubenswrapper[4681]: I0122 09:34:55.184908 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-ws6xn" Jan 22 09:34:55 crc kubenswrapper[4681]: I0122 09:34:55.280116 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsz97\" (UniqueName: \"kubernetes.io/projected/38c5661d-35c9-4a26-a697-1797743e610f-kube-api-access-nsz97\") pod \"38c5661d-35c9-4a26-a697-1797743e610f\" (UID: \"38c5661d-35c9-4a26-a697-1797743e610f\") " Jan 22 09:34:55 crc kubenswrapper[4681]: I0122 09:34:55.285376 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38c5661d-35c9-4a26-a697-1797743e610f-kube-api-access-nsz97" (OuterVolumeSpecName: "kube-api-access-nsz97") pod "38c5661d-35c9-4a26-a697-1797743e610f" (UID: "38c5661d-35c9-4a26-a697-1797743e610f"). InnerVolumeSpecName "kube-api-access-nsz97". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:34:55 crc kubenswrapper[4681]: I0122 09:34:55.292740 4681 generic.go:334] "Generic (PLEG): container finished" podID="38c5661d-35c9-4a26-a697-1797743e610f" containerID="dd649dadb8edc8d5251dbedc9f7b8ec3c37d47d6a8a140d158ce69520a256805" exitCode=0 Jan 22 09:34:55 crc kubenswrapper[4681]: I0122 09:34:55.292784 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-ws6xn" event={"ID":"38c5661d-35c9-4a26-a697-1797743e610f","Type":"ContainerDied","Data":"dd649dadb8edc8d5251dbedc9f7b8ec3c37d47d6a8a140d158ce69520a256805"} Jan 22 09:34:55 crc kubenswrapper[4681]: I0122 09:34:55.292815 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-ws6xn" event={"ID":"38c5661d-35c9-4a26-a697-1797743e610f","Type":"ContainerDied","Data":"25ae5314255ead370cdf4a924279908a9af98aaa8e85d5bf0bff2a3eb5311482"} Jan 22 09:34:55 crc kubenswrapper[4681]: I0122 09:34:55.292835 4681 scope.go:117] "RemoveContainer" containerID="dd649dadb8edc8d5251dbedc9f7b8ec3c37d47d6a8a140d158ce69520a256805" Jan 22 09:34:55 crc kubenswrapper[4681]: I0122 09:34:55.292957 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-ws6xn" Jan 22 09:34:55 crc kubenswrapper[4681]: I0122 09:34:55.326506 4681 scope.go:117] "RemoveContainer" containerID="dd649dadb8edc8d5251dbedc9f7b8ec3c37d47d6a8a140d158ce69520a256805" Jan 22 09:34:55 crc kubenswrapper[4681]: E0122 09:34:55.327243 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd649dadb8edc8d5251dbedc9f7b8ec3c37d47d6a8a140d158ce69520a256805\": container with ID starting with dd649dadb8edc8d5251dbedc9f7b8ec3c37d47d6a8a140d158ce69520a256805 not found: ID does not exist" containerID="dd649dadb8edc8d5251dbedc9f7b8ec3c37d47d6a8a140d158ce69520a256805" Jan 22 09:34:55 crc kubenswrapper[4681]: I0122 09:34:55.327351 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd649dadb8edc8d5251dbedc9f7b8ec3c37d47d6a8a140d158ce69520a256805"} err="failed to get container status \"dd649dadb8edc8d5251dbedc9f7b8ec3c37d47d6a8a140d158ce69520a256805\": rpc error: code = NotFound desc = could not find container \"dd649dadb8edc8d5251dbedc9f7b8ec3c37d47d6a8a140d158ce69520a256805\": container with ID starting with dd649dadb8edc8d5251dbedc9f7b8ec3c37d47d6a8a140d158ce69520a256805 not found: ID does not exist" Jan 22 09:34:55 crc kubenswrapper[4681]: I0122 09:34:55.332150 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-ws6xn"] Jan 22 09:34:55 crc kubenswrapper[4681]: I0122 09:34:55.336732 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-ws6xn"] Jan 22 09:34:55 crc kubenswrapper[4681]: I0122 09:34:55.381696 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsz97\" (UniqueName: \"kubernetes.io/projected/38c5661d-35c9-4a26-a697-1797743e610f-kube-api-access-nsz97\") on node \"crc\" DevicePath \"\"" Jan 22 09:34:55 crc kubenswrapper[4681]: I0122 09:34:55.467591 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38c5661d-35c9-4a26-a697-1797743e610f" path="/var/lib/kubelet/pods/38c5661d-35c9-4a26-a697-1797743e610f/volumes" Jan 22 09:36:26 crc kubenswrapper[4681]: I0122 09:36:26.033069 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:36:26 crc kubenswrapper[4681]: I0122 09:36:26.033768 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:36:56 crc kubenswrapper[4681]: I0122 09:36:56.031675 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:36:56 crc kubenswrapper[4681]: I0122 09:36:56.032175 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" 
podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:37:01 crc kubenswrapper[4681]: I0122 09:37:01.071055 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rwvx8"] Jan 22 09:37:01 crc kubenswrapper[4681]: E0122 09:37:01.071937 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38c5661d-35c9-4a26-a697-1797743e610f" containerName="registry-server" Jan 22 09:37:01 crc kubenswrapper[4681]: I0122 09:37:01.071953 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="38c5661d-35c9-4a26-a697-1797743e610f" containerName="registry-server" Jan 22 09:37:01 crc kubenswrapper[4681]: I0122 09:37:01.072122 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="38c5661d-35c9-4a26-a697-1797743e610f" containerName="registry-server" Jan 22 09:37:01 crc kubenswrapper[4681]: I0122 09:37:01.073170 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwvx8" Jan 22 09:37:01 crc kubenswrapper[4681]: I0122 09:37:01.091500 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rwvx8"] Jan 22 09:37:01 crc kubenswrapper[4681]: I0122 09:37:01.115016 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d04c351a-6628-46be-a342-b44ed7d0cb4d-utilities\") pod \"community-operators-rwvx8\" (UID: \"d04c351a-6628-46be-a342-b44ed7d0cb4d\") " pod="openshift-marketplace/community-operators-rwvx8" Jan 22 09:37:01 crc kubenswrapper[4681]: I0122 09:37:01.115121 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d04c351a-6628-46be-a342-b44ed7d0cb4d-catalog-content\") pod \"community-operators-rwvx8\" (UID: \"d04c351a-6628-46be-a342-b44ed7d0cb4d\") " pod="openshift-marketplace/community-operators-rwvx8" Jan 22 09:37:01 crc kubenswrapper[4681]: I0122 09:37:01.115195 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2gbq\" (UniqueName: \"kubernetes.io/projected/d04c351a-6628-46be-a342-b44ed7d0cb4d-kube-api-access-b2gbq\") pod \"community-operators-rwvx8\" (UID: \"d04c351a-6628-46be-a342-b44ed7d0cb4d\") " pod="openshift-marketplace/community-operators-rwvx8" Jan 22 09:37:01 crc kubenswrapper[4681]: I0122 09:37:01.216799 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2gbq\" (UniqueName: \"kubernetes.io/projected/d04c351a-6628-46be-a342-b44ed7d0cb4d-kube-api-access-b2gbq\") pod \"community-operators-rwvx8\" (UID: \"d04c351a-6628-46be-a342-b44ed7d0cb4d\") " pod="openshift-marketplace/community-operators-rwvx8" Jan 22 09:37:01 crc kubenswrapper[4681]: I0122 09:37:01.216897 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d04c351a-6628-46be-a342-b44ed7d0cb4d-utilities\") pod \"community-operators-rwvx8\" (UID: \"d04c351a-6628-46be-a342-b44ed7d0cb4d\") " pod="openshift-marketplace/community-operators-rwvx8" Jan 22 09:37:01 crc kubenswrapper[4681]: I0122 09:37:01.216943 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/d04c351a-6628-46be-a342-b44ed7d0cb4d-catalog-content\") pod \"community-operators-rwvx8\" (UID: \"d04c351a-6628-46be-a342-b44ed7d0cb4d\") " pod="openshift-marketplace/community-operators-rwvx8" Jan 22 09:37:01 crc kubenswrapper[4681]: I0122 09:37:01.217681 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d04c351a-6628-46be-a342-b44ed7d0cb4d-utilities\") pod \"community-operators-rwvx8\" (UID: \"d04c351a-6628-46be-a342-b44ed7d0cb4d\") " pod="openshift-marketplace/community-operators-rwvx8" Jan 22 09:37:01 crc kubenswrapper[4681]: I0122 09:37:01.217690 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d04c351a-6628-46be-a342-b44ed7d0cb4d-catalog-content\") pod \"community-operators-rwvx8\" (UID: \"d04c351a-6628-46be-a342-b44ed7d0cb4d\") " pod="openshift-marketplace/community-operators-rwvx8" Jan 22 09:37:01 crc kubenswrapper[4681]: I0122 09:37:01.244857 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2gbq\" (UniqueName: \"kubernetes.io/projected/d04c351a-6628-46be-a342-b44ed7d0cb4d-kube-api-access-b2gbq\") pod \"community-operators-rwvx8\" (UID: \"d04c351a-6628-46be-a342-b44ed7d0cb4d\") " pod="openshift-marketplace/community-operators-rwvx8" Jan 22 09:37:01 crc kubenswrapper[4681]: I0122 09:37:01.407614 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwvx8" Jan 22 09:37:01 crc kubenswrapper[4681]: I0122 09:37:01.874522 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rwvx8"] Jan 22 09:37:02 crc kubenswrapper[4681]: I0122 09:37:02.523483 4681 generic.go:334] "Generic (PLEG): container finished" podID="d04c351a-6628-46be-a342-b44ed7d0cb4d" containerID="0527e1fb4bf2fc3b876ea2d2da6ea11c61dc5007b357f1c9a1c89dfbf3bb019e" exitCode=0 Jan 22 09:37:02 crc kubenswrapper[4681]: I0122 09:37:02.523520 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwvx8" event={"ID":"d04c351a-6628-46be-a342-b44ed7d0cb4d","Type":"ContainerDied","Data":"0527e1fb4bf2fc3b876ea2d2da6ea11c61dc5007b357f1c9a1c89dfbf3bb019e"} Jan 22 09:37:02 crc kubenswrapper[4681]: I0122 09:37:02.523838 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwvx8" event={"ID":"d04c351a-6628-46be-a342-b44ed7d0cb4d","Type":"ContainerStarted","Data":"faed9e5e17c25ef1d793d44bbd7bc93fe3a7760df96d95735940231dcda07b8e"} Jan 22 09:37:04 crc kubenswrapper[4681]: I0122 09:37:04.542672 4681 generic.go:334] "Generic (PLEG): container finished" podID="d04c351a-6628-46be-a342-b44ed7d0cb4d" containerID="d18b7661016c2c9a07343b158b9187817568a1ed0e3672a9b30a4e683846983b" exitCode=0 Jan 22 09:37:04 crc kubenswrapper[4681]: I0122 09:37:04.542770 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwvx8" event={"ID":"d04c351a-6628-46be-a342-b44ed7d0cb4d","Type":"ContainerDied","Data":"d18b7661016c2c9a07343b158b9187817568a1ed0e3672a9b30a4e683846983b"} Jan 22 09:37:05 crc kubenswrapper[4681]: I0122 09:37:05.552364 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwvx8" 
event={"ID":"d04c351a-6628-46be-a342-b44ed7d0cb4d","Type":"ContainerStarted","Data":"00da3a10377405ee9c8827c4b9dea004755007e562d4806627c205424b62d27b"} Jan 22 09:37:05 crc kubenswrapper[4681]: I0122 09:37:05.575900 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rwvx8" podStartSLOduration=2.00419288 podStartE2EDuration="4.57588291s" podCreationTimestamp="2026-01-22 09:37:01 +0000 UTC" firstStartedPulling="2026-01-22 09:37:02.525049707 +0000 UTC m=+2013.350960212" lastFinishedPulling="2026-01-22 09:37:05.096739737 +0000 UTC m=+2015.922650242" observedRunningTime="2026-01-22 09:37:05.570765984 +0000 UTC m=+2016.396676499" watchObservedRunningTime="2026-01-22 09:37:05.57588291 +0000 UTC m=+2016.401793415" Jan 22 09:37:11 crc kubenswrapper[4681]: I0122 09:37:11.408422 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rwvx8" Jan 22 09:37:11 crc kubenswrapper[4681]: I0122 09:37:11.410731 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rwvx8" Jan 22 09:37:11 crc kubenswrapper[4681]: I0122 09:37:11.717538 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rwvx8" Jan 22 09:37:11 crc kubenswrapper[4681]: I0122 09:37:11.717627 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rwvx8" Jan 22 09:37:11 crc kubenswrapper[4681]: I0122 09:37:11.829117 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rwvx8"] Jan 22 09:37:13 crc kubenswrapper[4681]: I0122 09:37:13.607717 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rwvx8" podUID="d04c351a-6628-46be-a342-b44ed7d0cb4d" containerName="registry-server" containerID="cri-o://00da3a10377405ee9c8827c4b9dea004755007e562d4806627c205424b62d27b" gracePeriod=2 Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.056312 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rwvx8" Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.125210 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d04c351a-6628-46be-a342-b44ed7d0cb4d-catalog-content\") pod \"d04c351a-6628-46be-a342-b44ed7d0cb4d\" (UID: \"d04c351a-6628-46be-a342-b44ed7d0cb4d\") " Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.125405 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d04c351a-6628-46be-a342-b44ed7d0cb4d-utilities\") pod \"d04c351a-6628-46be-a342-b44ed7d0cb4d\" (UID: \"d04c351a-6628-46be-a342-b44ed7d0cb4d\") " Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.125458 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2gbq\" (UniqueName: \"kubernetes.io/projected/d04c351a-6628-46be-a342-b44ed7d0cb4d-kube-api-access-b2gbq\") pod \"d04c351a-6628-46be-a342-b44ed7d0cb4d\" (UID: \"d04c351a-6628-46be-a342-b44ed7d0cb4d\") " Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.128227 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d04c351a-6628-46be-a342-b44ed7d0cb4d-utilities" (OuterVolumeSpecName: "utilities") pod "d04c351a-6628-46be-a342-b44ed7d0cb4d" (UID: "d04c351a-6628-46be-a342-b44ed7d0cb4d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.132979 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d04c351a-6628-46be-a342-b44ed7d0cb4d-kube-api-access-b2gbq" (OuterVolumeSpecName: "kube-api-access-b2gbq") pod "d04c351a-6628-46be-a342-b44ed7d0cb4d" (UID: "d04c351a-6628-46be-a342-b44ed7d0cb4d"). InnerVolumeSpecName "kube-api-access-b2gbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.190128 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d04c351a-6628-46be-a342-b44ed7d0cb4d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d04c351a-6628-46be-a342-b44ed7d0cb4d" (UID: "d04c351a-6628-46be-a342-b44ed7d0cb4d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.227286 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d04c351a-6628-46be-a342-b44ed7d0cb4d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.227318 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d04c351a-6628-46be-a342-b44ed7d0cb4d-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.227327 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2gbq\" (UniqueName: \"kubernetes.io/projected/d04c351a-6628-46be-a342-b44ed7d0cb4d-kube-api-access-b2gbq\") on node \"crc\" DevicePath \"\"" Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.622028 4681 generic.go:334] "Generic (PLEG): container finished" podID="d04c351a-6628-46be-a342-b44ed7d0cb4d" containerID="00da3a10377405ee9c8827c4b9dea004755007e562d4806627c205424b62d27b" exitCode=0 Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.622131 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwvx8" Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.622124 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwvx8" event={"ID":"d04c351a-6628-46be-a342-b44ed7d0cb4d","Type":"ContainerDied","Data":"00da3a10377405ee9c8827c4b9dea004755007e562d4806627c205424b62d27b"} Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.622228 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwvx8" event={"ID":"d04c351a-6628-46be-a342-b44ed7d0cb4d","Type":"ContainerDied","Data":"faed9e5e17c25ef1d793d44bbd7bc93fe3a7760df96d95735940231dcda07b8e"} Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.622272 4681 scope.go:117] "RemoveContainer" containerID="00da3a10377405ee9c8827c4b9dea004755007e562d4806627c205424b62d27b" Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.649022 4681 scope.go:117] "RemoveContainer" containerID="d18b7661016c2c9a07343b158b9187817568a1ed0e3672a9b30a4e683846983b" Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.678243 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rwvx8"] Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.688687 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rwvx8"] Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.694240 4681 scope.go:117] "RemoveContainer" containerID="0527e1fb4bf2fc3b876ea2d2da6ea11c61dc5007b357f1c9a1c89dfbf3bb019e" Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.754360 4681 scope.go:117] "RemoveContainer" containerID="00da3a10377405ee9c8827c4b9dea004755007e562d4806627c205424b62d27b" Jan 22 09:37:14 crc kubenswrapper[4681]: E0122 09:37:14.757750 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00da3a10377405ee9c8827c4b9dea004755007e562d4806627c205424b62d27b\": container with ID starting with 00da3a10377405ee9c8827c4b9dea004755007e562d4806627c205424b62d27b not found: ID does not exist" containerID="00da3a10377405ee9c8827c4b9dea004755007e562d4806627c205424b62d27b" Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.757806 
4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00da3a10377405ee9c8827c4b9dea004755007e562d4806627c205424b62d27b"} err="failed to get container status \"00da3a10377405ee9c8827c4b9dea004755007e562d4806627c205424b62d27b\": rpc error: code = NotFound desc = could not find container \"00da3a10377405ee9c8827c4b9dea004755007e562d4806627c205424b62d27b\": container with ID starting with 00da3a10377405ee9c8827c4b9dea004755007e562d4806627c205424b62d27b not found: ID does not exist" Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.757835 4681 scope.go:117] "RemoveContainer" containerID="d18b7661016c2c9a07343b158b9187817568a1ed0e3672a9b30a4e683846983b" Jan 22 09:37:14 crc kubenswrapper[4681]: E0122 09:37:14.758371 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d18b7661016c2c9a07343b158b9187817568a1ed0e3672a9b30a4e683846983b\": container with ID starting with d18b7661016c2c9a07343b158b9187817568a1ed0e3672a9b30a4e683846983b not found: ID does not exist" containerID="d18b7661016c2c9a07343b158b9187817568a1ed0e3672a9b30a4e683846983b" Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.758445 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18b7661016c2c9a07343b158b9187817568a1ed0e3672a9b30a4e683846983b"} err="failed to get container status \"d18b7661016c2c9a07343b158b9187817568a1ed0e3672a9b30a4e683846983b\": rpc error: code = NotFound desc = could not find container \"d18b7661016c2c9a07343b158b9187817568a1ed0e3672a9b30a4e683846983b\": container with ID starting with d18b7661016c2c9a07343b158b9187817568a1ed0e3672a9b30a4e683846983b not found: ID does not exist" Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.758490 4681 scope.go:117] "RemoveContainer" containerID="0527e1fb4bf2fc3b876ea2d2da6ea11c61dc5007b357f1c9a1c89dfbf3bb019e" Jan 22 09:37:14 crc kubenswrapper[4681]: E0122 09:37:14.758850 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0527e1fb4bf2fc3b876ea2d2da6ea11c61dc5007b357f1c9a1c89dfbf3bb019e\": container with ID starting with 0527e1fb4bf2fc3b876ea2d2da6ea11c61dc5007b357f1c9a1c89dfbf3bb019e not found: ID does not exist" containerID="0527e1fb4bf2fc3b876ea2d2da6ea11c61dc5007b357f1c9a1c89dfbf3bb019e" Jan 22 09:37:14 crc kubenswrapper[4681]: I0122 09:37:14.758883 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0527e1fb4bf2fc3b876ea2d2da6ea11c61dc5007b357f1c9a1c89dfbf3bb019e"} err="failed to get container status \"0527e1fb4bf2fc3b876ea2d2da6ea11c61dc5007b357f1c9a1c89dfbf3bb019e\": rpc error: code = NotFound desc = could not find container \"0527e1fb4bf2fc3b876ea2d2da6ea11c61dc5007b357f1c9a1c89dfbf3bb019e\": container with ID starting with 0527e1fb4bf2fc3b876ea2d2da6ea11c61dc5007b357f1c9a1c89dfbf3bb019e not found: ID does not exist" Jan 22 09:37:15 crc kubenswrapper[4681]: I0122 09:37:15.467875 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d04c351a-6628-46be-a342-b44ed7d0cb4d" path="/var/lib/kubelet/pods/d04c351a-6628-46be-a342-b44ed7d0cb4d/volumes" Jan 22 09:37:26 crc kubenswrapper[4681]: I0122 09:37:26.049109 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:37:26 crc kubenswrapper[4681]: I0122 09:37:26.049669 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:37:26 crc kubenswrapper[4681]: I0122 09:37:26.049721 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:37:26 crc kubenswrapper[4681]: I0122 09:37:26.050351 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e1c94f07f44d5de719a64c7ab20f849f2e3e6362d4e3f454de26b2a29c719d72"} pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 09:37:26 crc kubenswrapper[4681]: I0122 09:37:26.050419 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" containerID="cri-o://e1c94f07f44d5de719a64c7ab20f849f2e3e6362d4e3f454de26b2a29c719d72" gracePeriod=600 Jan 22 09:37:26 crc kubenswrapper[4681]: I0122 09:37:26.734033 4681 generic.go:334] "Generic (PLEG): container finished" podID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerID="e1c94f07f44d5de719a64c7ab20f849f2e3e6362d4e3f454de26b2a29c719d72" exitCode=0 Jan 22 09:37:26 crc kubenswrapper[4681]: I0122 09:37:26.734096 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerDied","Data":"e1c94f07f44d5de719a64c7ab20f849f2e3e6362d4e3f454de26b2a29c719d72"} Jan 22 09:37:26 crc kubenswrapper[4681]: I0122 09:37:26.734162 4681 scope.go:117] "RemoveContainer" containerID="1ac09fb8ba3e26fbfedb50fc6d4d1805006b3e3c07b98978485b52bfdf3a259e" Jan 22 09:37:27 crc kubenswrapper[4681]: I0122 09:37:27.742236 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerStarted","Data":"05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd"} Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.125355 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5px7g"] Jan 22 09:37:33 crc kubenswrapper[4681]: E0122 09:37:33.127121 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04c351a-6628-46be-a342-b44ed7d0cb4d" containerName="extract-content" Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.127208 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04c351a-6628-46be-a342-b44ed7d0cb4d" containerName="extract-content" Jan 22 09:37:33 crc kubenswrapper[4681]: E0122 09:37:33.127325 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04c351a-6628-46be-a342-b44ed7d0cb4d" containerName="registry-server" Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.127396 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04c351a-6628-46be-a342-b44ed7d0cb4d" 
containerName="registry-server" Jan 22 09:37:33 crc kubenswrapper[4681]: E0122 09:37:33.127458 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04c351a-6628-46be-a342-b44ed7d0cb4d" containerName="extract-utilities" Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.127518 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04c351a-6628-46be-a342-b44ed7d0cb4d" containerName="extract-utilities" Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.127687 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04c351a-6628-46be-a342-b44ed7d0cb4d" containerName="registry-server" Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.128672 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5px7g" Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.140829 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5px7g"] Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.154167 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-catalog-content\") pod \"redhat-operators-5px7g\" (UID: \"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da\") " pod="openshift-marketplace/redhat-operators-5px7g" Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.154207 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvj7z\" (UniqueName: \"kubernetes.io/projected/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-kube-api-access-nvj7z\") pod \"redhat-operators-5px7g\" (UID: \"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da\") " pod="openshift-marketplace/redhat-operators-5px7g" Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.154306 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-utilities\") pod \"redhat-operators-5px7g\" (UID: \"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da\") " pod="openshift-marketplace/redhat-operators-5px7g" Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.255155 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-catalog-content\") pod \"redhat-operators-5px7g\" (UID: \"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da\") " pod="openshift-marketplace/redhat-operators-5px7g" Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.255249 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvj7z\" (UniqueName: \"kubernetes.io/projected/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-kube-api-access-nvj7z\") pod \"redhat-operators-5px7g\" (UID: \"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da\") " pod="openshift-marketplace/redhat-operators-5px7g" Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.255396 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-utilities\") pod \"redhat-operators-5px7g\" (UID: \"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da\") " pod="openshift-marketplace/redhat-operators-5px7g" Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.255880 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-catalog-content\") pod \"redhat-operators-5px7g\" (UID: \"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da\") " pod="openshift-marketplace/redhat-operators-5px7g" Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.255913 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-utilities\") pod \"redhat-operators-5px7g\" (UID: \"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da\") " pod="openshift-marketplace/redhat-operators-5px7g" Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.273924 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvj7z\" (UniqueName: \"kubernetes.io/projected/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-kube-api-access-nvj7z\") pod \"redhat-operators-5px7g\" (UID: \"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da\") " pod="openshift-marketplace/redhat-operators-5px7g" Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.457794 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5px7g" Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.710848 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5px7g"] Jan 22 09:37:33 crc kubenswrapper[4681]: I0122 09:37:33.790229 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5px7g" event={"ID":"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da","Type":"ContainerStarted","Data":"e907b369a167e2bf8e477157b9614c2bc5509db6d4c3ed14249af7c1541a1d41"} Jan 22 09:37:34 crc kubenswrapper[4681]: I0122 09:37:34.799710 4681 generic.go:334] "Generic (PLEG): container finished" podID="5f28b8e6-a5cb-4d13-80c6-0a983f26f0da" containerID="1392fcdfc34bca34274d7676616cbdf085c808da57fa235585905ea740b4af56" exitCode=0 Jan 22 09:37:34 crc kubenswrapper[4681]: I0122 09:37:34.799831 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5px7g" event={"ID":"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da","Type":"ContainerDied","Data":"1392fcdfc34bca34274d7676616cbdf085c808da57fa235585905ea740b4af56"} Jan 22 09:37:35 crc kubenswrapper[4681]: I0122 09:37:35.815525 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5px7g" event={"ID":"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da","Type":"ContainerStarted","Data":"b4d2d0c8802f5bdeccc7d4115abbca82d50a0fe11861997022282c465422eaf9"} Jan 22 09:37:37 crc kubenswrapper[4681]: I0122 09:37:37.835158 4681 generic.go:334] "Generic (PLEG): container finished" podID="5f28b8e6-a5cb-4d13-80c6-0a983f26f0da" containerID="b4d2d0c8802f5bdeccc7d4115abbca82d50a0fe11861997022282c465422eaf9" exitCode=0 Jan 22 09:37:37 crc kubenswrapper[4681]: I0122 09:37:37.835218 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5px7g" event={"ID":"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da","Type":"ContainerDied","Data":"b4d2d0c8802f5bdeccc7d4115abbca82d50a0fe11861997022282c465422eaf9"} Jan 22 09:37:40 crc kubenswrapper[4681]: I0122 09:37:40.864911 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5px7g" event={"ID":"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da","Type":"ContainerStarted","Data":"2331b97cc6480482f993bbc16b673284b7243f01bdde41bf8e619cd65ed80542"} Jan 22 09:37:40 crc kubenswrapper[4681]: I0122 09:37:40.902504 4681 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5px7g" podStartSLOduration=2.440587581 podStartE2EDuration="7.902487175s" podCreationTimestamp="2026-01-22 09:37:33 +0000 UTC" firstStartedPulling="2026-01-22 09:37:34.801956124 +0000 UTC m=+2045.627866639" lastFinishedPulling="2026-01-22 09:37:40.263855718 +0000 UTC m=+2051.089766233" observedRunningTime="2026-01-22 09:37:40.897594645 +0000 UTC m=+2051.723505190" watchObservedRunningTime="2026-01-22 09:37:40.902487175 +0000 UTC m=+2051.728397690" Jan 22 09:37:43 crc kubenswrapper[4681]: I0122 09:37:43.466159 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5px7g" Jan 22 09:37:43 crc kubenswrapper[4681]: I0122 09:37:43.466616 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5px7g" Jan 22 09:37:44 crc kubenswrapper[4681]: I0122 09:37:44.522597 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5px7g" podUID="5f28b8e6-a5cb-4d13-80c6-0a983f26f0da" containerName="registry-server" probeResult="failure" output=< Jan 22 09:37:44 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Jan 22 09:37:44 crc kubenswrapper[4681]: > Jan 22 09:37:53 crc kubenswrapper[4681]: I0122 09:37:53.527802 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5px7g" Jan 22 09:37:53 crc kubenswrapper[4681]: I0122 09:37:53.600408 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5px7g" Jan 22 09:37:53 crc kubenswrapper[4681]: I0122 09:37:53.779748 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5px7g"] Jan 22 09:37:54 crc kubenswrapper[4681]: I0122 09:37:54.991230 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5px7g" podUID="5f28b8e6-a5cb-4d13-80c6-0a983f26f0da" containerName="registry-server" containerID="cri-o://2331b97cc6480482f993bbc16b673284b7243f01bdde41bf8e619cd65ed80542" gracePeriod=2 Jan 22 09:37:55 crc kubenswrapper[4681]: I0122 09:37:55.440349 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5px7g" Jan 22 09:37:55 crc kubenswrapper[4681]: I0122 09:37:55.619183 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-utilities\") pod \"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da\" (UID: \"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da\") " Jan 22 09:37:55 crc kubenswrapper[4681]: I0122 09:37:55.619318 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvj7z\" (UniqueName: \"kubernetes.io/projected/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-kube-api-access-nvj7z\") pod \"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da\" (UID: \"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da\") " Jan 22 09:37:55 crc kubenswrapper[4681]: I0122 09:37:55.619361 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-catalog-content\") pod \"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da\" (UID: \"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da\") " Jan 22 09:37:55 crc kubenswrapper[4681]: I0122 09:37:55.621354 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-utilities" (OuterVolumeSpecName: "utilities") pod "5f28b8e6-a5cb-4d13-80c6-0a983f26f0da" (UID: "5f28b8e6-a5cb-4d13-80c6-0a983f26f0da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:37:55 crc kubenswrapper[4681]: I0122 09:37:55.627076 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-kube-api-access-nvj7z" (OuterVolumeSpecName: "kube-api-access-nvj7z") pod "5f28b8e6-a5cb-4d13-80c6-0a983f26f0da" (UID: "5f28b8e6-a5cb-4d13-80c6-0a983f26f0da"). InnerVolumeSpecName "kube-api-access-nvj7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:37:55 crc kubenswrapper[4681]: I0122 09:37:55.721418 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:37:55 crc kubenswrapper[4681]: I0122 09:37:55.721472 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvj7z\" (UniqueName: \"kubernetes.io/projected/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-kube-api-access-nvj7z\") on node \"crc\" DevicePath \"\"" Jan 22 09:37:55 crc kubenswrapper[4681]: I0122 09:37:55.741116 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f28b8e6-a5cb-4d13-80c6-0a983f26f0da" (UID: "5f28b8e6-a5cb-4d13-80c6-0a983f26f0da"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:37:55 crc kubenswrapper[4681]: I0122 09:37:55.822909 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:37:56 crc kubenswrapper[4681]: I0122 09:37:56.002691 4681 generic.go:334] "Generic (PLEG): container finished" podID="5f28b8e6-a5cb-4d13-80c6-0a983f26f0da" containerID="2331b97cc6480482f993bbc16b673284b7243f01bdde41bf8e619cd65ed80542" exitCode=0 Jan 22 09:37:56 crc kubenswrapper[4681]: I0122 09:37:56.002734 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5px7g" event={"ID":"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da","Type":"ContainerDied","Data":"2331b97cc6480482f993bbc16b673284b7243f01bdde41bf8e619cd65ed80542"} Jan 22 09:37:56 crc kubenswrapper[4681]: I0122 09:37:56.002760 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5px7g" event={"ID":"5f28b8e6-a5cb-4d13-80c6-0a983f26f0da","Type":"ContainerDied","Data":"e907b369a167e2bf8e477157b9614c2bc5509db6d4c3ed14249af7c1541a1d41"} Jan 22 09:37:56 crc kubenswrapper[4681]: I0122 09:37:56.002780 4681 scope.go:117] "RemoveContainer" containerID="2331b97cc6480482f993bbc16b673284b7243f01bdde41bf8e619cd65ed80542" Jan 22 09:37:56 crc kubenswrapper[4681]: I0122 09:37:56.002905 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5px7g" Jan 22 09:37:56 crc kubenswrapper[4681]: I0122 09:37:56.045560 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5px7g"] Jan 22 09:37:56 crc kubenswrapper[4681]: I0122 09:37:56.052691 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5px7g"] Jan 22 09:37:56 crc kubenswrapper[4681]: I0122 09:37:56.064019 4681 scope.go:117] "RemoveContainer" containerID="b4d2d0c8802f5bdeccc7d4115abbca82d50a0fe11861997022282c465422eaf9" Jan 22 09:37:56 crc kubenswrapper[4681]: I0122 09:37:56.081543 4681 scope.go:117] "RemoveContainer" containerID="1392fcdfc34bca34274d7676616cbdf085c808da57fa235585905ea740b4af56" Jan 22 09:37:56 crc kubenswrapper[4681]: I0122 09:37:56.107714 4681 scope.go:117] "RemoveContainer" containerID="2331b97cc6480482f993bbc16b673284b7243f01bdde41bf8e619cd65ed80542" Jan 22 09:37:56 crc kubenswrapper[4681]: E0122 09:37:56.108256 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2331b97cc6480482f993bbc16b673284b7243f01bdde41bf8e619cd65ed80542\": container with ID starting with 2331b97cc6480482f993bbc16b673284b7243f01bdde41bf8e619cd65ed80542 not found: ID does not exist" containerID="2331b97cc6480482f993bbc16b673284b7243f01bdde41bf8e619cd65ed80542" Jan 22 09:37:56 crc kubenswrapper[4681]: I0122 09:37:56.108395 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2331b97cc6480482f993bbc16b673284b7243f01bdde41bf8e619cd65ed80542"} err="failed to get container status \"2331b97cc6480482f993bbc16b673284b7243f01bdde41bf8e619cd65ed80542\": rpc error: code = NotFound desc = could not find container \"2331b97cc6480482f993bbc16b673284b7243f01bdde41bf8e619cd65ed80542\": container with ID starting with 2331b97cc6480482f993bbc16b673284b7243f01bdde41bf8e619cd65ed80542 not found: ID does not exist" Jan 22 09:37:56 crc 
kubenswrapper[4681]: I0122 09:37:56.108434 4681 scope.go:117] "RemoveContainer" containerID="b4d2d0c8802f5bdeccc7d4115abbca82d50a0fe11861997022282c465422eaf9" Jan 22 09:37:56 crc kubenswrapper[4681]: E0122 09:37:56.108772 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4d2d0c8802f5bdeccc7d4115abbca82d50a0fe11861997022282c465422eaf9\": container with ID starting with b4d2d0c8802f5bdeccc7d4115abbca82d50a0fe11861997022282c465422eaf9 not found: ID does not exist" containerID="b4d2d0c8802f5bdeccc7d4115abbca82d50a0fe11861997022282c465422eaf9" Jan 22 09:37:56 crc kubenswrapper[4681]: I0122 09:37:56.108809 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d2d0c8802f5bdeccc7d4115abbca82d50a0fe11861997022282c465422eaf9"} err="failed to get container status \"b4d2d0c8802f5bdeccc7d4115abbca82d50a0fe11861997022282c465422eaf9\": rpc error: code = NotFound desc = could not find container \"b4d2d0c8802f5bdeccc7d4115abbca82d50a0fe11861997022282c465422eaf9\": container with ID starting with b4d2d0c8802f5bdeccc7d4115abbca82d50a0fe11861997022282c465422eaf9 not found: ID does not exist" Jan 22 09:37:56 crc kubenswrapper[4681]: I0122 09:37:56.108835 4681 scope.go:117] "RemoveContainer" containerID="1392fcdfc34bca34274d7676616cbdf085c808da57fa235585905ea740b4af56" Jan 22 09:37:56 crc kubenswrapper[4681]: E0122 09:37:56.109450 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1392fcdfc34bca34274d7676616cbdf085c808da57fa235585905ea740b4af56\": container with ID starting with 1392fcdfc34bca34274d7676616cbdf085c808da57fa235585905ea740b4af56 not found: ID does not exist" containerID="1392fcdfc34bca34274d7676616cbdf085c808da57fa235585905ea740b4af56" Jan 22 09:37:56 crc kubenswrapper[4681]: I0122 09:37:56.109494 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1392fcdfc34bca34274d7676616cbdf085c808da57fa235585905ea740b4af56"} err="failed to get container status \"1392fcdfc34bca34274d7676616cbdf085c808da57fa235585905ea740b4af56\": rpc error: code = NotFound desc = could not find container \"1392fcdfc34bca34274d7676616cbdf085c808da57fa235585905ea740b4af56\": container with ID starting with 1392fcdfc34bca34274d7676616cbdf085c808da57fa235585905ea740b4af56 not found: ID does not exist" Jan 22 09:37:57 crc kubenswrapper[4681]: I0122 09:37:57.468142 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f28b8e6-a5cb-4d13-80c6-0a983f26f0da" path="/var/lib/kubelet/pods/5f28b8e6-a5cb-4d13-80c6-0a983f26f0da/volumes" Jan 22 09:38:01 crc kubenswrapper[4681]: I0122 09:38:01.826207 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9zc9r"] Jan 22 09:38:01 crc kubenswrapper[4681]: E0122 09:38:01.827017 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f28b8e6-a5cb-4d13-80c6-0a983f26f0da" containerName="extract-utilities" Jan 22 09:38:01 crc kubenswrapper[4681]: I0122 09:38:01.827041 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f28b8e6-a5cb-4d13-80c6-0a983f26f0da" containerName="extract-utilities" Jan 22 09:38:01 crc kubenswrapper[4681]: E0122 09:38:01.827067 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f28b8e6-a5cb-4d13-80c6-0a983f26f0da" containerName="extract-content" Jan 22 09:38:01 crc kubenswrapper[4681]: I0122 09:38:01.827081 4681 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5f28b8e6-a5cb-4d13-80c6-0a983f26f0da" containerName="extract-content" Jan 22 09:38:01 crc kubenswrapper[4681]: E0122 09:38:01.827100 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f28b8e6-a5cb-4d13-80c6-0a983f26f0da" containerName="registry-server" Jan 22 09:38:01 crc kubenswrapper[4681]: I0122 09:38:01.827114 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f28b8e6-a5cb-4d13-80c6-0a983f26f0da" containerName="registry-server" Jan 22 09:38:01 crc kubenswrapper[4681]: I0122 09:38:01.829338 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f28b8e6-a5cb-4d13-80c6-0a983f26f0da" containerName="registry-server" Jan 22 09:38:01 crc kubenswrapper[4681]: I0122 09:38:01.831016 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zc9r" Jan 22 09:38:01 crc kubenswrapper[4681]: I0122 09:38:01.844591 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9zc9r"] Jan 22 09:38:01 crc kubenswrapper[4681]: I0122 09:38:01.943007 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzght\" (UniqueName: \"kubernetes.io/projected/87c5af92-a305-49a2-be71-4c873555d70e-kube-api-access-bzght\") pod \"certified-operators-9zc9r\" (UID: \"87c5af92-a305-49a2-be71-4c873555d70e\") " pod="openshift-marketplace/certified-operators-9zc9r" Jan 22 09:38:01 crc kubenswrapper[4681]: I0122 09:38:01.943073 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87c5af92-a305-49a2-be71-4c873555d70e-utilities\") pod \"certified-operators-9zc9r\" (UID: \"87c5af92-a305-49a2-be71-4c873555d70e\") " pod="openshift-marketplace/certified-operators-9zc9r" Jan 22 09:38:01 crc kubenswrapper[4681]: I0122 09:38:01.943135 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87c5af92-a305-49a2-be71-4c873555d70e-catalog-content\") pod \"certified-operators-9zc9r\" (UID: \"87c5af92-a305-49a2-be71-4c873555d70e\") " pod="openshift-marketplace/certified-operators-9zc9r" Jan 22 09:38:02 crc kubenswrapper[4681]: I0122 09:38:02.044826 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzght\" (UniqueName: \"kubernetes.io/projected/87c5af92-a305-49a2-be71-4c873555d70e-kube-api-access-bzght\") pod \"certified-operators-9zc9r\" (UID: \"87c5af92-a305-49a2-be71-4c873555d70e\") " pod="openshift-marketplace/certified-operators-9zc9r" Jan 22 09:38:02 crc kubenswrapper[4681]: I0122 09:38:02.044916 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87c5af92-a305-49a2-be71-4c873555d70e-utilities\") pod \"certified-operators-9zc9r\" (UID: \"87c5af92-a305-49a2-be71-4c873555d70e\") " pod="openshift-marketplace/certified-operators-9zc9r" Jan 22 09:38:02 crc kubenswrapper[4681]: I0122 09:38:02.044995 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87c5af92-a305-49a2-be71-4c873555d70e-catalog-content\") pod \"certified-operators-9zc9r\" (UID: \"87c5af92-a305-49a2-be71-4c873555d70e\") " pod="openshift-marketplace/certified-operators-9zc9r" Jan 22 09:38:02 crc 
kubenswrapper[4681]: I0122 09:38:02.045581 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87c5af92-a305-49a2-be71-4c873555d70e-catalog-content\") pod \"certified-operators-9zc9r\" (UID: \"87c5af92-a305-49a2-be71-4c873555d70e\") " pod="openshift-marketplace/certified-operators-9zc9r" Jan 22 09:38:02 crc kubenswrapper[4681]: I0122 09:38:02.046028 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87c5af92-a305-49a2-be71-4c873555d70e-utilities\") pod \"certified-operators-9zc9r\" (UID: \"87c5af92-a305-49a2-be71-4c873555d70e\") " pod="openshift-marketplace/certified-operators-9zc9r" Jan 22 09:38:02 crc kubenswrapper[4681]: I0122 09:38:02.078599 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzght\" (UniqueName: \"kubernetes.io/projected/87c5af92-a305-49a2-be71-4c873555d70e-kube-api-access-bzght\") pod \"certified-operators-9zc9r\" (UID: \"87c5af92-a305-49a2-be71-4c873555d70e\") " pod="openshift-marketplace/certified-operators-9zc9r" Jan 22 09:38:02 crc kubenswrapper[4681]: I0122 09:38:02.159717 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zc9r" Jan 22 09:38:02 crc kubenswrapper[4681]: I0122 09:38:02.672565 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9zc9r"] Jan 22 09:38:02 crc kubenswrapper[4681]: W0122 09:38:02.675405 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87c5af92_a305_49a2_be71_4c873555d70e.slice/crio-dcac9f2a8fe221c5639ca74684b8bc04b2e6028ebbfd9ebca00c411b08e02e0b WatchSource:0}: Error finding container dcac9f2a8fe221c5639ca74684b8bc04b2e6028ebbfd9ebca00c411b08e02e0b: Status 404 returned error can't find the container with id dcac9f2a8fe221c5639ca74684b8bc04b2e6028ebbfd9ebca00c411b08e02e0b Jan 22 09:38:03 crc kubenswrapper[4681]: I0122 09:38:03.085855 4681 generic.go:334] "Generic (PLEG): container finished" podID="87c5af92-a305-49a2-be71-4c873555d70e" containerID="553ed1e489dd01d3869b1a5bd6d8aef71514969dd5cace27c9acd0cff60c7624" exitCode=0 Jan 22 09:38:03 crc kubenswrapper[4681]: I0122 09:38:03.085901 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zc9r" event={"ID":"87c5af92-a305-49a2-be71-4c873555d70e","Type":"ContainerDied","Data":"553ed1e489dd01d3869b1a5bd6d8aef71514969dd5cace27c9acd0cff60c7624"} Jan 22 09:38:03 crc kubenswrapper[4681]: I0122 09:38:03.085929 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zc9r" event={"ID":"87c5af92-a305-49a2-be71-4c873555d70e","Type":"ContainerStarted","Data":"dcac9f2a8fe221c5639ca74684b8bc04b2e6028ebbfd9ebca00c411b08e02e0b"} Jan 22 09:38:05 crc kubenswrapper[4681]: I0122 09:38:05.104634 4681 generic.go:334] "Generic (PLEG): container finished" podID="87c5af92-a305-49a2-be71-4c873555d70e" containerID="ca36b324741cc3950d7466239a7a1c2c6170c02fd22f03ae2844914432fdc0b3" exitCode=0 Jan 22 09:38:05 crc kubenswrapper[4681]: I0122 09:38:05.104752 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zc9r" event={"ID":"87c5af92-a305-49a2-be71-4c873555d70e","Type":"ContainerDied","Data":"ca36b324741cc3950d7466239a7a1c2c6170c02fd22f03ae2844914432fdc0b3"} Jan 22 09:38:06 crc 
kubenswrapper[4681]: I0122 09:38:06.116109 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zc9r" event={"ID":"87c5af92-a305-49a2-be71-4c873555d70e","Type":"ContainerStarted","Data":"4d13493c6cb416ca58f414467eff4e3c58c1f775a530c41c2f7dedecca70e54e"} Jan 22 09:38:06 crc kubenswrapper[4681]: I0122 09:38:06.139212 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9zc9r" podStartSLOduration=2.708535999 podStartE2EDuration="5.139197818s" podCreationTimestamp="2026-01-22 09:38:01 +0000 UTC" firstStartedPulling="2026-01-22 09:38:03.087762217 +0000 UTC m=+2073.913672722" lastFinishedPulling="2026-01-22 09:38:05.518424026 +0000 UTC m=+2076.344334541" observedRunningTime="2026-01-22 09:38:06.138655714 +0000 UTC m=+2076.964566219" watchObservedRunningTime="2026-01-22 09:38:06.139197818 +0000 UTC m=+2076.965108323" Jan 22 09:38:12 crc kubenswrapper[4681]: I0122 09:38:12.160461 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9zc9r" Jan 22 09:38:12 crc kubenswrapper[4681]: I0122 09:38:12.160798 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9zc9r" Jan 22 09:38:12 crc kubenswrapper[4681]: I0122 09:38:12.236859 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9zc9r" Jan 22 09:38:12 crc kubenswrapper[4681]: I0122 09:38:12.316830 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9zc9r" Jan 22 09:38:12 crc kubenswrapper[4681]: I0122 09:38:12.489389 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9zc9r"] Jan 22 09:38:14 crc kubenswrapper[4681]: I0122 09:38:14.188995 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9zc9r" podUID="87c5af92-a305-49a2-be71-4c873555d70e" containerName="registry-server" containerID="cri-o://4d13493c6cb416ca58f414467eff4e3c58c1f775a530c41c2f7dedecca70e54e" gracePeriod=2 Jan 22 09:38:14 crc kubenswrapper[4681]: I0122 09:38:14.730287 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9zc9r" Jan 22 09:38:14 crc kubenswrapper[4681]: I0122 09:38:14.842561 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87c5af92-a305-49a2-be71-4c873555d70e-utilities\") pod \"87c5af92-a305-49a2-be71-4c873555d70e\" (UID: \"87c5af92-a305-49a2-be71-4c873555d70e\") " Jan 22 09:38:14 crc kubenswrapper[4681]: I0122 09:38:14.842696 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87c5af92-a305-49a2-be71-4c873555d70e-catalog-content\") pod \"87c5af92-a305-49a2-be71-4c873555d70e\" (UID: \"87c5af92-a305-49a2-be71-4c873555d70e\") " Jan 22 09:38:14 crc kubenswrapper[4681]: I0122 09:38:14.842881 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzght\" (UniqueName: \"kubernetes.io/projected/87c5af92-a305-49a2-be71-4c873555d70e-kube-api-access-bzght\") pod \"87c5af92-a305-49a2-be71-4c873555d70e\" (UID: \"87c5af92-a305-49a2-be71-4c873555d70e\") " Jan 22 09:38:14 crc kubenswrapper[4681]: I0122 09:38:14.843606 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87c5af92-a305-49a2-be71-4c873555d70e-utilities" (OuterVolumeSpecName: "utilities") pod "87c5af92-a305-49a2-be71-4c873555d70e" (UID: "87c5af92-a305-49a2-be71-4c873555d70e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:38:14 crc kubenswrapper[4681]: I0122 09:38:14.849422 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c5af92-a305-49a2-be71-4c873555d70e-kube-api-access-bzght" (OuterVolumeSpecName: "kube-api-access-bzght") pod "87c5af92-a305-49a2-be71-4c873555d70e" (UID: "87c5af92-a305-49a2-be71-4c873555d70e"). InnerVolumeSpecName "kube-api-access-bzght". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:38:14 crc kubenswrapper[4681]: I0122 09:38:14.925618 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87c5af92-a305-49a2-be71-4c873555d70e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87c5af92-a305-49a2-be71-4c873555d70e" (UID: "87c5af92-a305-49a2-be71-4c873555d70e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:38:14 crc kubenswrapper[4681]: I0122 09:38:14.944985 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzght\" (UniqueName: \"kubernetes.io/projected/87c5af92-a305-49a2-be71-4c873555d70e-kube-api-access-bzght\") on node \"crc\" DevicePath \"\"" Jan 22 09:38:14 crc kubenswrapper[4681]: I0122 09:38:14.945020 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87c5af92-a305-49a2-be71-4c873555d70e-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:38:14 crc kubenswrapper[4681]: I0122 09:38:14.945031 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87c5af92-a305-49a2-be71-4c873555d70e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:38:15 crc kubenswrapper[4681]: I0122 09:38:15.197940 4681 generic.go:334] "Generic (PLEG): container finished" podID="87c5af92-a305-49a2-be71-4c873555d70e" containerID="4d13493c6cb416ca58f414467eff4e3c58c1f775a530c41c2f7dedecca70e54e" exitCode=0 Jan 22 09:38:15 crc kubenswrapper[4681]: I0122 09:38:15.197994 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zc9r" event={"ID":"87c5af92-a305-49a2-be71-4c873555d70e","Type":"ContainerDied","Data":"4d13493c6cb416ca58f414467eff4e3c58c1f775a530c41c2f7dedecca70e54e"} Jan 22 09:38:15 crc kubenswrapper[4681]: I0122 09:38:15.198040 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zc9r" Jan 22 09:38:15 crc kubenswrapper[4681]: I0122 09:38:15.198062 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zc9r" event={"ID":"87c5af92-a305-49a2-be71-4c873555d70e","Type":"ContainerDied","Data":"dcac9f2a8fe221c5639ca74684b8bc04b2e6028ebbfd9ebca00c411b08e02e0b"} Jan 22 09:38:15 crc kubenswrapper[4681]: I0122 09:38:15.198081 4681 scope.go:117] "RemoveContainer" containerID="4d13493c6cb416ca58f414467eff4e3c58c1f775a530c41c2f7dedecca70e54e" Jan 22 09:38:15 crc kubenswrapper[4681]: I0122 09:38:15.239931 4681 scope.go:117] "RemoveContainer" containerID="ca36b324741cc3950d7466239a7a1c2c6170c02fd22f03ae2844914432fdc0b3" Jan 22 09:38:15 crc kubenswrapper[4681]: I0122 09:38:15.243458 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9zc9r"] Jan 22 09:38:15 crc kubenswrapper[4681]: I0122 09:38:15.247600 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9zc9r"] Jan 22 09:38:15 crc kubenswrapper[4681]: I0122 09:38:15.278628 4681 scope.go:117] "RemoveContainer" containerID="553ed1e489dd01d3869b1a5bd6d8aef71514969dd5cace27c9acd0cff60c7624" Jan 22 09:38:15 crc kubenswrapper[4681]: I0122 09:38:15.302206 4681 scope.go:117] "RemoveContainer" containerID="4d13493c6cb416ca58f414467eff4e3c58c1f775a530c41c2f7dedecca70e54e" Jan 22 09:38:15 crc kubenswrapper[4681]: E0122 09:38:15.302917 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d13493c6cb416ca58f414467eff4e3c58c1f775a530c41c2f7dedecca70e54e\": container with ID starting with 4d13493c6cb416ca58f414467eff4e3c58c1f775a530c41c2f7dedecca70e54e not found: ID does not exist" containerID="4d13493c6cb416ca58f414467eff4e3c58c1f775a530c41c2f7dedecca70e54e" Jan 22 09:38:15 crc kubenswrapper[4681]: I0122 09:38:15.302988 
4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d13493c6cb416ca58f414467eff4e3c58c1f775a530c41c2f7dedecca70e54e"} err="failed to get container status \"4d13493c6cb416ca58f414467eff4e3c58c1f775a530c41c2f7dedecca70e54e\": rpc error: code = NotFound desc = could not find container \"4d13493c6cb416ca58f414467eff4e3c58c1f775a530c41c2f7dedecca70e54e\": container with ID starting with 4d13493c6cb416ca58f414467eff4e3c58c1f775a530c41c2f7dedecca70e54e not found: ID does not exist" Jan 22 09:38:15 crc kubenswrapper[4681]: I0122 09:38:15.303031 4681 scope.go:117] "RemoveContainer" containerID="ca36b324741cc3950d7466239a7a1c2c6170c02fd22f03ae2844914432fdc0b3" Jan 22 09:38:15 crc kubenswrapper[4681]: E0122 09:38:15.303643 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca36b324741cc3950d7466239a7a1c2c6170c02fd22f03ae2844914432fdc0b3\": container with ID starting with ca36b324741cc3950d7466239a7a1c2c6170c02fd22f03ae2844914432fdc0b3 not found: ID does not exist" containerID="ca36b324741cc3950d7466239a7a1c2c6170c02fd22f03ae2844914432fdc0b3" Jan 22 09:38:15 crc kubenswrapper[4681]: I0122 09:38:15.303691 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca36b324741cc3950d7466239a7a1c2c6170c02fd22f03ae2844914432fdc0b3"} err="failed to get container status \"ca36b324741cc3950d7466239a7a1c2c6170c02fd22f03ae2844914432fdc0b3\": rpc error: code = NotFound desc = could not find container \"ca36b324741cc3950d7466239a7a1c2c6170c02fd22f03ae2844914432fdc0b3\": container with ID starting with ca36b324741cc3950d7466239a7a1c2c6170c02fd22f03ae2844914432fdc0b3 not found: ID does not exist" Jan 22 09:38:15 crc kubenswrapper[4681]: I0122 09:38:15.303725 4681 scope.go:117] "RemoveContainer" containerID="553ed1e489dd01d3869b1a5bd6d8aef71514969dd5cace27c9acd0cff60c7624" Jan 22 09:38:15 crc kubenswrapper[4681]: E0122 09:38:15.304377 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"553ed1e489dd01d3869b1a5bd6d8aef71514969dd5cace27c9acd0cff60c7624\": container with ID starting with 553ed1e489dd01d3869b1a5bd6d8aef71514969dd5cace27c9acd0cff60c7624 not found: ID does not exist" containerID="553ed1e489dd01d3869b1a5bd6d8aef71514969dd5cace27c9acd0cff60c7624" Jan 22 09:38:15 crc kubenswrapper[4681]: I0122 09:38:15.304440 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"553ed1e489dd01d3869b1a5bd6d8aef71514969dd5cace27c9acd0cff60c7624"} err="failed to get container status \"553ed1e489dd01d3869b1a5bd6d8aef71514969dd5cace27c9acd0cff60c7624\": rpc error: code = NotFound desc = could not find container \"553ed1e489dd01d3869b1a5bd6d8aef71514969dd5cace27c9acd0cff60c7624\": container with ID starting with 553ed1e489dd01d3869b1a5bd6d8aef71514969dd5cace27c9acd0cff60c7624 not found: ID does not exist" Jan 22 09:38:15 crc kubenswrapper[4681]: I0122 09:38:15.483783 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87c5af92-a305-49a2-be71-4c873555d70e" path="/var/lib/kubelet/pods/87c5af92-a305-49a2-be71-4c873555d70e/volumes" Jan 22 09:39:56 crc kubenswrapper[4681]: I0122 09:39:56.035078 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:39:56 crc kubenswrapper[4681]: I0122 09:39:56.035740 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:40:00 crc kubenswrapper[4681]: I0122 09:40:00.766058 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-operators-px426"] Jan 22 09:40:00 crc kubenswrapper[4681]: E0122 09:40:00.767015 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c5af92-a305-49a2-be71-4c873555d70e" containerName="registry-server" Jan 22 09:40:00 crc kubenswrapper[4681]: I0122 09:40:00.767035 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c5af92-a305-49a2-be71-4c873555d70e" containerName="registry-server" Jan 22 09:40:00 crc kubenswrapper[4681]: E0122 09:40:00.767068 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c5af92-a305-49a2-be71-4c873555d70e" containerName="extract-utilities" Jan 22 09:40:00 crc kubenswrapper[4681]: I0122 09:40:00.767081 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c5af92-a305-49a2-be71-4c873555d70e" containerName="extract-utilities" Jan 22 09:40:00 crc kubenswrapper[4681]: E0122 09:40:00.767100 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c5af92-a305-49a2-be71-4c873555d70e" containerName="extract-content" Jan 22 09:40:00 crc kubenswrapper[4681]: I0122 09:40:00.767114 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c5af92-a305-49a2-be71-4c873555d70e" containerName="extract-content" Jan 22 09:40:00 crc kubenswrapper[4681]: I0122 09:40:00.767355 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="87c5af92-a305-49a2-be71-4c873555d70e" containerName="registry-server" Jan 22 09:40:00 crc kubenswrapper[4681]: I0122 09:40:00.768074 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-px426" Jan 22 09:40:00 crc kubenswrapper[4681]: I0122 09:40:00.780588 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-px426"] Jan 22 09:40:00 crc kubenswrapper[4681]: I0122 09:40:00.881083 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7bsl\" (UniqueName: \"kubernetes.io/projected/3732213d-7ec8-4590-b0bf-8dfaa1a70115-kube-api-access-p7bsl\") pod \"service-telemetry-framework-operators-px426\" (UID: \"3732213d-7ec8-4590-b0bf-8dfaa1a70115\") " pod="service-telemetry/service-telemetry-framework-operators-px426" Jan 22 09:40:00 crc kubenswrapper[4681]: I0122 09:40:00.982796 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7bsl\" (UniqueName: \"kubernetes.io/projected/3732213d-7ec8-4590-b0bf-8dfaa1a70115-kube-api-access-p7bsl\") pod \"service-telemetry-framework-operators-px426\" (UID: \"3732213d-7ec8-4590-b0bf-8dfaa1a70115\") " pod="service-telemetry/service-telemetry-framework-operators-px426" Jan 22 09:40:01 crc kubenswrapper[4681]: I0122 09:40:01.023788 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7bsl\" (UniqueName: \"kubernetes.io/projected/3732213d-7ec8-4590-b0bf-8dfaa1a70115-kube-api-access-p7bsl\") pod \"service-telemetry-framework-operators-px426\" (UID: \"3732213d-7ec8-4590-b0bf-8dfaa1a70115\") " pod="service-telemetry/service-telemetry-framework-operators-px426" Jan 22 09:40:01 crc kubenswrapper[4681]: I0122 09:40:01.091571 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-px426" Jan 22 09:40:22 crc kubenswrapper[4681]: I0122 09:40:22.711578 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-px426"] Jan 22 09:40:22 crc kubenswrapper[4681]: I0122 09:40:22.732290 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 09:40:23 crc kubenswrapper[4681]: I0122 09:40:23.523627 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-px426" event={"ID":"3732213d-7ec8-4590-b0bf-8dfaa1a70115","Type":"ContainerStarted","Data":"fe75cab82e269633fb129262b4a4450fa4e0dd42f7526028f0e3be130b4f2304"} Jan 22 09:40:26 crc kubenswrapper[4681]: I0122 09:40:26.031361 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:40:26 crc kubenswrapper[4681]: I0122 09:40:26.031788 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:40:56 crc kubenswrapper[4681]: I0122 09:40:56.031609 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:40:56 crc kubenswrapper[4681]: I0122 09:40:56.032336 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:40:56 crc kubenswrapper[4681]: I0122 09:40:56.032420 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:40:56 crc kubenswrapper[4681]: I0122 09:40:56.033343 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd"} pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 09:40:56 crc kubenswrapper[4681]: I0122 09:40:56.033459 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" containerID="cri-o://05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" gracePeriod=600 Jan 22 09:41:04 crc kubenswrapper[4681]: I0122 09:41:04.589813 4681 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-daemon-zb7wn_d58a61a8-a6b2-4af6-92a6-c7bf6da6a432/machine-config-daemon/9.log" Jan 22 09:41:04 crc kubenswrapper[4681]: I0122 09:41:04.591838 4681 generic.go:334] "Generic (PLEG): container finished" podID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" exitCode=-1 Jan 22 09:41:04 crc kubenswrapper[4681]: I0122 09:41:04.591895 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerDied","Data":"05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd"} Jan 22 09:41:04 crc kubenswrapper[4681]: I0122 09:41:04.591983 4681 scope.go:117] "RemoveContainer" containerID="e1c94f07f44d5de719a64c7ab20f849f2e3e6362d4e3f454de26b2a29c719d72" Jan 22 09:41:36 crc kubenswrapper[4681]: E0122 09:41:36.492325 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:41:36 crc kubenswrapper[4681]: I0122 09:41:36.918508 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:41:36 crc kubenswrapper[4681]: E0122 09:41:36.919007 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:41:38 crc kubenswrapper[4681]: I0122 09:41:38.936848 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-px426" event={"ID":"3732213d-7ec8-4590-b0bf-8dfaa1a70115","Type":"ContainerStarted","Data":"fa52b80ab18ca439274012859ac228b8db2b14e0c4d5b4fa11e4354d81f45f1e"} Jan 22 09:41:38 crc kubenswrapper[4681]: I0122 09:41:38.961773 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-operators-px426" podStartSLOduration=24.097162499 podStartE2EDuration="1m38.961748715s" podCreationTimestamp="2026-01-22 09:40:00 +0000 UTC" firstStartedPulling="2026-01-22 09:40:22.731983303 +0000 UTC m=+2213.557893808" lastFinishedPulling="2026-01-22 09:41:37.596569489 +0000 UTC m=+2288.422480024" observedRunningTime="2026-01-22 09:41:38.952827588 +0000 UTC m=+2289.778738133" watchObservedRunningTime="2026-01-22 09:41:38.961748715 +0000 UTC m=+2289.787659230" Jan 22 09:41:41 crc kubenswrapper[4681]: I0122 09:41:41.092466 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/service-telemetry-framework-operators-px426" Jan 22 09:41:41 crc kubenswrapper[4681]: I0122 09:41:41.092572 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/service-telemetry-framework-operators-px426" Jan 22 09:41:41 crc kubenswrapper[4681]: I0122 09:41:41.123901 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/service-telemetry-framework-operators-px426" Jan 22 09:41:47 crc kubenswrapper[4681]: I0122 09:41:47.453381 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:41:47 crc kubenswrapper[4681]: E0122 09:41:47.453893 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:41:51 crc kubenswrapper[4681]: I0122 09:41:51.127881 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/service-telemetry-framework-operators-px426" Jan 22 09:41:51 crc kubenswrapper[4681]: I0122 09:41:51.176660 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-px426"] Jan 22 09:41:52 crc kubenswrapper[4681]: I0122 09:41:52.106041 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-framework-operators-px426" podUID="3732213d-7ec8-4590-b0bf-8dfaa1a70115" containerName="registry-server" containerID="cri-o://fa52b80ab18ca439274012859ac228b8db2b14e0c4d5b4fa11e4354d81f45f1e" gracePeriod=2 Jan 22 09:41:53 crc kubenswrapper[4681]: I0122 09:41:53.122344 4681 generic.go:334] "Generic (PLEG): container finished" podID="3732213d-7ec8-4590-b0bf-8dfaa1a70115" containerID="fa52b80ab18ca439274012859ac228b8db2b14e0c4d5b4fa11e4354d81f45f1e" exitCode=0 Jan 22 09:41:53 crc 
kubenswrapper[4681]: I0122 09:41:53.122391 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-px426" event={"ID":"3732213d-7ec8-4590-b0bf-8dfaa1a70115","Type":"ContainerDied","Data":"fa52b80ab18ca439274012859ac228b8db2b14e0c4d5b4fa11e4354d81f45f1e"} Jan 22 09:41:53 crc kubenswrapper[4681]: I0122 09:41:53.277255 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-px426" Jan 22 09:41:53 crc kubenswrapper[4681]: I0122 09:41:53.403959 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7bsl\" (UniqueName: \"kubernetes.io/projected/3732213d-7ec8-4590-b0bf-8dfaa1a70115-kube-api-access-p7bsl\") pod \"3732213d-7ec8-4590-b0bf-8dfaa1a70115\" (UID: \"3732213d-7ec8-4590-b0bf-8dfaa1a70115\") " Jan 22 09:41:53 crc kubenswrapper[4681]: I0122 09:41:53.425367 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3732213d-7ec8-4590-b0bf-8dfaa1a70115-kube-api-access-p7bsl" (OuterVolumeSpecName: "kube-api-access-p7bsl") pod "3732213d-7ec8-4590-b0bf-8dfaa1a70115" (UID: "3732213d-7ec8-4590-b0bf-8dfaa1a70115"). InnerVolumeSpecName "kube-api-access-p7bsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:41:53 crc kubenswrapper[4681]: I0122 09:41:53.505538 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7bsl\" (UniqueName: \"kubernetes.io/projected/3732213d-7ec8-4590-b0bf-8dfaa1a70115-kube-api-access-p7bsl\") on node \"crc\" DevicePath \"\"" Jan 22 09:41:54 crc kubenswrapper[4681]: I0122 09:41:54.134947 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-px426" event={"ID":"3732213d-7ec8-4590-b0bf-8dfaa1a70115","Type":"ContainerDied","Data":"fe75cab82e269633fb129262b4a4450fa4e0dd42f7526028f0e3be130b4f2304"} Jan 22 09:41:54 crc kubenswrapper[4681]: I0122 09:41:54.135000 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-px426" Jan 22 09:41:54 crc kubenswrapper[4681]: I0122 09:41:54.135027 4681 scope.go:117] "RemoveContainer" containerID="fa52b80ab18ca439274012859ac228b8db2b14e0c4d5b4fa11e4354d81f45f1e" Jan 22 09:41:54 crc kubenswrapper[4681]: I0122 09:41:54.171258 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-px426"] Jan 22 09:41:54 crc kubenswrapper[4681]: I0122 09:41:54.174995 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-px426"] Jan 22 09:41:55 crc kubenswrapper[4681]: I0122 09:41:55.460957 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3732213d-7ec8-4590-b0bf-8dfaa1a70115" path="/var/lib/kubelet/pods/3732213d-7ec8-4590-b0bf-8dfaa1a70115/volumes" Jan 22 09:42:00 crc kubenswrapper[4681]: I0122 09:42:00.452936 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:42:00 crc kubenswrapper[4681]: E0122 09:42:00.453552 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:42:14 crc kubenswrapper[4681]: I0122 09:42:14.453023 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:42:14 crc kubenswrapper[4681]: E0122 09:42:14.454152 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:42:27 crc kubenswrapper[4681]: I0122 09:42:27.452723 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:42:27 crc kubenswrapper[4681]: E0122 09:42:27.453576 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:42:39 crc kubenswrapper[4681]: I0122 09:42:39.460836 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:42:39 crc kubenswrapper[4681]: E0122 09:42:39.461484 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" 
podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:42:50 crc kubenswrapper[4681]: I0122 09:42:50.454031 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:42:50 crc kubenswrapper[4681]: E0122 09:42:50.455114 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:43:01 crc kubenswrapper[4681]: I0122 09:43:01.452907 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:43:01 crc kubenswrapper[4681]: E0122 09:43:01.453626 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:43:16 crc kubenswrapper[4681]: I0122 09:43:16.452250 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:43:16 crc kubenswrapper[4681]: E0122 09:43:16.453303 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:43:29 crc kubenswrapper[4681]: I0122 09:43:29.460054 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:43:29 crc kubenswrapper[4681]: E0122 09:43:29.461129 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:43:43 crc kubenswrapper[4681]: I0122 09:43:43.453118 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:43:43 crc kubenswrapper[4681]: E0122 09:43:43.454084 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:43:54 crc kubenswrapper[4681]: I0122 09:43:54.453358 4681 scope.go:117] "RemoveContainer" 
containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:43:54 crc kubenswrapper[4681]: E0122 09:43:54.454435 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:44:08 crc kubenswrapper[4681]: I0122 09:44:08.453072 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:44:08 crc kubenswrapper[4681]: E0122 09:44:08.453823 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:44:19 crc kubenswrapper[4681]: I0122 09:44:19.455969 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:44:19 crc kubenswrapper[4681]: E0122 09:44:19.457353 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:44:31 crc kubenswrapper[4681]: I0122 09:44:31.452918 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:44:31 crc kubenswrapper[4681]: E0122 09:44:31.453749 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:44:44 crc kubenswrapper[4681]: I0122 09:44:44.452004 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:44:44 crc kubenswrapper[4681]: E0122 09:44:44.453023 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:44:59 crc kubenswrapper[4681]: I0122 09:44:59.470957 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:44:59 crc kubenswrapper[4681]: E0122 09:44:59.472682 4681 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:45:00 crc kubenswrapper[4681]: I0122 09:45:00.158240 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth"] Jan 22 09:45:00 crc kubenswrapper[4681]: E0122 09:45:00.158655 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3732213d-7ec8-4590-b0bf-8dfaa1a70115" containerName="registry-server" Jan 22 09:45:00 crc kubenswrapper[4681]: I0122 09:45:00.158684 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="3732213d-7ec8-4590-b0bf-8dfaa1a70115" containerName="registry-server" Jan 22 09:45:00 crc kubenswrapper[4681]: I0122 09:45:00.158939 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="3732213d-7ec8-4590-b0bf-8dfaa1a70115" containerName="registry-server" Jan 22 09:45:00 crc kubenswrapper[4681]: I0122 09:45:00.159655 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" Jan 22 09:45:00 crc kubenswrapper[4681]: I0122 09:45:00.164149 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 22 09:45:00 crc kubenswrapper[4681]: I0122 09:45:00.164215 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 22 09:45:00 crc kubenswrapper[4681]: I0122 09:45:00.174378 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth"] Jan 22 09:45:00 crc kubenswrapper[4681]: I0122 09:45:00.335868 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb7ht\" (UniqueName: \"kubernetes.io/projected/47a583c4-99f0-4abf-8c5b-14f705f1bbca-kube-api-access-gb7ht\") pod \"collect-profiles-29484585-mxvth\" (UID: \"47a583c4-99f0-4abf-8c5b-14f705f1bbca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" Jan 22 09:45:00 crc kubenswrapper[4681]: I0122 09:45:00.335976 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47a583c4-99f0-4abf-8c5b-14f705f1bbca-config-volume\") pod \"collect-profiles-29484585-mxvth\" (UID: \"47a583c4-99f0-4abf-8c5b-14f705f1bbca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" Jan 22 09:45:00 crc kubenswrapper[4681]: I0122 09:45:00.336172 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47a583c4-99f0-4abf-8c5b-14f705f1bbca-secret-volume\") pod \"collect-profiles-29484585-mxvth\" (UID: \"47a583c4-99f0-4abf-8c5b-14f705f1bbca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" Jan 22 09:45:00 crc kubenswrapper[4681]: I0122 09:45:00.438085 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb7ht\" (UniqueName: 
\"kubernetes.io/projected/47a583c4-99f0-4abf-8c5b-14f705f1bbca-kube-api-access-gb7ht\") pod \"collect-profiles-29484585-mxvth\" (UID: \"47a583c4-99f0-4abf-8c5b-14f705f1bbca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" Jan 22 09:45:00 crc kubenswrapper[4681]: I0122 09:45:00.438791 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47a583c4-99f0-4abf-8c5b-14f705f1bbca-config-volume\") pod \"collect-profiles-29484585-mxvth\" (UID: \"47a583c4-99f0-4abf-8c5b-14f705f1bbca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" Jan 22 09:45:00 crc kubenswrapper[4681]: I0122 09:45:00.439057 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47a583c4-99f0-4abf-8c5b-14f705f1bbca-secret-volume\") pod \"collect-profiles-29484585-mxvth\" (UID: \"47a583c4-99f0-4abf-8c5b-14f705f1bbca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" Jan 22 09:45:00 crc kubenswrapper[4681]: I0122 09:45:00.440454 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47a583c4-99f0-4abf-8c5b-14f705f1bbca-config-volume\") pod \"collect-profiles-29484585-mxvth\" (UID: \"47a583c4-99f0-4abf-8c5b-14f705f1bbca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" Jan 22 09:45:00 crc kubenswrapper[4681]: I0122 09:45:00.459507 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47a583c4-99f0-4abf-8c5b-14f705f1bbca-secret-volume\") pod \"collect-profiles-29484585-mxvth\" (UID: \"47a583c4-99f0-4abf-8c5b-14f705f1bbca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" Jan 22 09:45:00 crc kubenswrapper[4681]: I0122 09:45:00.470770 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb7ht\" (UniqueName: \"kubernetes.io/projected/47a583c4-99f0-4abf-8c5b-14f705f1bbca-kube-api-access-gb7ht\") pod \"collect-profiles-29484585-mxvth\" (UID: \"47a583c4-99f0-4abf-8c5b-14f705f1bbca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" Jan 22 09:45:00 crc kubenswrapper[4681]: I0122 09:45:00.515433 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" Jan 22 09:45:01 crc kubenswrapper[4681]: I0122 09:45:01.001071 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth"] Jan 22 09:45:01 crc kubenswrapper[4681]: I0122 09:45:01.774899 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" event={"ID":"47a583c4-99f0-4abf-8c5b-14f705f1bbca","Type":"ContainerStarted","Data":"7a519585c61a0f116bd1e2c958e49b5ddf2d9f02f07f87bd6feb7aecb6f28796"} Jan 22 09:45:01 crc kubenswrapper[4681]: I0122 09:45:01.775240 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" event={"ID":"47a583c4-99f0-4abf-8c5b-14f705f1bbca","Type":"ContainerStarted","Data":"fe2984317df3c81cc7a176b714bcb9c84184cf0e338b478afb5eea16f1613d54"} Jan 22 09:45:01 crc kubenswrapper[4681]: I0122 09:45:01.797559 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" podStartSLOduration=1.797538933 podStartE2EDuration="1.797538933s" podCreationTimestamp="2026-01-22 09:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-22 09:45:01.793338121 +0000 UTC m=+2492.619248636" watchObservedRunningTime="2026-01-22 09:45:01.797538933 +0000 UTC m=+2492.623449448" Jan 22 09:45:02 crc kubenswrapper[4681]: I0122 09:45:02.783540 4681 generic.go:334] "Generic (PLEG): container finished" podID="47a583c4-99f0-4abf-8c5b-14f705f1bbca" containerID="7a519585c61a0f116bd1e2c958e49b5ddf2d9f02f07f87bd6feb7aecb6f28796" exitCode=0 Jan 22 09:45:02 crc kubenswrapper[4681]: I0122 09:45:02.783595 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" event={"ID":"47a583c4-99f0-4abf-8c5b-14f705f1bbca","Type":"ContainerDied","Data":"7a519585c61a0f116bd1e2c958e49b5ddf2d9f02f07f87bd6feb7aecb6f28796"} Jan 22 09:45:04 crc kubenswrapper[4681]: I0122 09:45:04.041978 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" Jan 22 09:45:04 crc kubenswrapper[4681]: I0122 09:45:04.194096 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47a583c4-99f0-4abf-8c5b-14f705f1bbca-config-volume\") pod \"47a583c4-99f0-4abf-8c5b-14f705f1bbca\" (UID: \"47a583c4-99f0-4abf-8c5b-14f705f1bbca\") " Jan 22 09:45:04 crc kubenswrapper[4681]: I0122 09:45:04.194296 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47a583c4-99f0-4abf-8c5b-14f705f1bbca-secret-volume\") pod \"47a583c4-99f0-4abf-8c5b-14f705f1bbca\" (UID: \"47a583c4-99f0-4abf-8c5b-14f705f1bbca\") " Jan 22 09:45:04 crc kubenswrapper[4681]: I0122 09:45:04.195093 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47a583c4-99f0-4abf-8c5b-14f705f1bbca-config-volume" (OuterVolumeSpecName: "config-volume") pod "47a583c4-99f0-4abf-8c5b-14f705f1bbca" (UID: "47a583c4-99f0-4abf-8c5b-14f705f1bbca"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 09:45:04 crc kubenswrapper[4681]: I0122 09:45:04.194417 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb7ht\" (UniqueName: \"kubernetes.io/projected/47a583c4-99f0-4abf-8c5b-14f705f1bbca-kube-api-access-gb7ht\") pod \"47a583c4-99f0-4abf-8c5b-14f705f1bbca\" (UID: \"47a583c4-99f0-4abf-8c5b-14f705f1bbca\") " Jan 22 09:45:04 crc kubenswrapper[4681]: I0122 09:45:04.195575 4681 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47a583c4-99f0-4abf-8c5b-14f705f1bbca-config-volume\") on node \"crc\" DevicePath \"\"" Jan 22 09:45:04 crc kubenswrapper[4681]: I0122 09:45:04.199290 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a583c4-99f0-4abf-8c5b-14f705f1bbca-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "47a583c4-99f0-4abf-8c5b-14f705f1bbca" (UID: "47a583c4-99f0-4abf-8c5b-14f705f1bbca"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 09:45:04 crc kubenswrapper[4681]: I0122 09:45:04.201275 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47a583c4-99f0-4abf-8c5b-14f705f1bbca-kube-api-access-gb7ht" (OuterVolumeSpecName: "kube-api-access-gb7ht") pod "47a583c4-99f0-4abf-8c5b-14f705f1bbca" (UID: "47a583c4-99f0-4abf-8c5b-14f705f1bbca"). InnerVolumeSpecName "kube-api-access-gb7ht". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:45:04 crc kubenswrapper[4681]: I0122 09:45:04.296643 4681 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47a583c4-99f0-4abf-8c5b-14f705f1bbca-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 22 09:45:04 crc kubenswrapper[4681]: I0122 09:45:04.296683 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb7ht\" (UniqueName: \"kubernetes.io/projected/47a583c4-99f0-4abf-8c5b-14f705f1bbca-kube-api-access-gb7ht\") on node \"crc\" DevicePath \"\"" Jan 22 09:45:04 crc kubenswrapper[4681]: I0122 09:45:04.798713 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" event={"ID":"47a583c4-99f0-4abf-8c5b-14f705f1bbca","Type":"ContainerDied","Data":"fe2984317df3c81cc7a176b714bcb9c84184cf0e338b478afb5eea16f1613d54"} Jan 22 09:45:04 crc kubenswrapper[4681]: I0122 09:45:04.799028 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe2984317df3c81cc7a176b714bcb9c84184cf0e338b478afb5eea16f1613d54" Jan 22 09:45:04 crc kubenswrapper[4681]: I0122 09:45:04.798753 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484585-mxvth" Jan 22 09:45:04 crc kubenswrapper[4681]: I0122 09:45:04.882595 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h"] Jan 22 09:45:04 crc kubenswrapper[4681]: I0122 09:45:04.892990 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484540-h6x5h"] Jan 22 09:45:05 crc kubenswrapper[4681]: I0122 09:45:05.480814 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="630a0503-a218-4ac5-b1db-01b76a08f5c1" path="/var/lib/kubelet/pods/630a0503-a218-4ac5-b1db-01b76a08f5c1/volumes" Jan 22 09:45:13 crc kubenswrapper[4681]: I0122 09:45:13.453128 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:45:13 crc kubenswrapper[4681]: E0122 09:45:13.453990 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:45:26 crc kubenswrapper[4681]: I0122 09:45:26.452950 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:45:26 crc kubenswrapper[4681]: E0122 09:45:26.453699 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:45:30 crc kubenswrapper[4681]: I0122 09:45:30.876231 4681 scope.go:117] "RemoveContainer" containerID="950577f6efd0f52f9114d15046386d4fedee5e59db9af4df2dd946762bca948c" Jan 22 09:45:39 crc kubenswrapper[4681]: I0122 09:45:39.459139 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:45:39 crc kubenswrapper[4681]: E0122 09:45:39.460362 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:45:51 crc kubenswrapper[4681]: I0122 09:45:51.456678 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:45:51 crc kubenswrapper[4681]: E0122 09:45:51.457711 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:46:05 crc kubenswrapper[4681]: I0122 09:46:05.452926 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:46:05 crc kubenswrapper[4681]: E0122 09:46:05.453842 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:46:16 crc kubenswrapper[4681]: I0122 09:46:16.453864 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:46:16 crc kubenswrapper[4681]: E0122 09:46:16.454982 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:46:28 crc kubenswrapper[4681]: I0122 09:46:28.452780 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:46:28 crc kubenswrapper[4681]: E0122 09:46:28.453757 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:46:37 crc kubenswrapper[4681]: I0122 09:46:37.478936 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-operators-m5k87"] Jan 22 09:46:37 crc kubenswrapper[4681]: E0122 09:46:37.479752 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47a583c4-99f0-4abf-8c5b-14f705f1bbca" containerName="collect-profiles" Jan 22 09:46:37 crc kubenswrapper[4681]: I0122 09:46:37.479764 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a583c4-99f0-4abf-8c5b-14f705f1bbca" containerName="collect-profiles" Jan 22 09:46:37 crc kubenswrapper[4681]: I0122 09:46:37.479880 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="47a583c4-99f0-4abf-8c5b-14f705f1bbca" containerName="collect-profiles" Jan 22 09:46:37 crc kubenswrapper[4681]: I0122 09:46:37.480429 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-m5k87" Jan 22 09:46:37 crc kubenswrapper[4681]: I0122 09:46:37.496724 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-m5k87"] Jan 22 09:46:37 crc kubenswrapper[4681]: I0122 09:46:37.624815 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmqss\" (UniqueName: \"kubernetes.io/projected/0c629dda-d564-48c2-9f82-fe5d893bcf53-kube-api-access-rmqss\") pod \"service-telemetry-framework-operators-m5k87\" (UID: \"0c629dda-d564-48c2-9f82-fe5d893bcf53\") " pod="service-telemetry/service-telemetry-framework-operators-m5k87" Jan 22 09:46:37 crc kubenswrapper[4681]: I0122 09:46:37.727127 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmqss\" (UniqueName: \"kubernetes.io/projected/0c629dda-d564-48c2-9f82-fe5d893bcf53-kube-api-access-rmqss\") pod \"service-telemetry-framework-operators-m5k87\" (UID: \"0c629dda-d564-48c2-9f82-fe5d893bcf53\") " pod="service-telemetry/service-telemetry-framework-operators-m5k87" Jan 22 09:46:37 crc kubenswrapper[4681]: I0122 09:46:37.769309 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmqss\" (UniqueName: \"kubernetes.io/projected/0c629dda-d564-48c2-9f82-fe5d893bcf53-kube-api-access-rmqss\") pod \"service-telemetry-framework-operators-m5k87\" (UID: \"0c629dda-d564-48c2-9f82-fe5d893bcf53\") " pod="service-telemetry/service-telemetry-framework-operators-m5k87" Jan 22 09:46:37 crc kubenswrapper[4681]: I0122 09:46:37.816877 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-m5k87" Jan 22 09:46:38 crc kubenswrapper[4681]: I0122 09:46:38.062428 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-m5k87"] Jan 22 09:46:38 crc kubenswrapper[4681]: I0122 09:46:38.080318 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 09:46:38 crc kubenswrapper[4681]: I0122 09:46:38.620530 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-m5k87" event={"ID":"0c629dda-d564-48c2-9f82-fe5d893bcf53","Type":"ContainerStarted","Data":"2f8c3f62d600cac1abc241ed18a2c52e8ce0f1a892d91e137a37fbb60cf8540c"} Jan 22 09:46:38 crc kubenswrapper[4681]: I0122 09:46:38.623051 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-m5k87" event={"ID":"0c629dda-d564-48c2-9f82-fe5d893bcf53","Type":"ContainerStarted","Data":"acd3e4be4170ea587e7f3aab054874132c456eaeaaf67e971c280b1a6ac9b870"} Jan 22 09:46:38 crc kubenswrapper[4681]: I0122 09:46:38.644643 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-operators-m5k87" podStartSLOduration=1.53376231 podStartE2EDuration="1.644620638s" podCreationTimestamp="2026-01-22 09:46:37 +0000 UTC" firstStartedPulling="2026-01-22 09:46:38.080032215 +0000 UTC m=+2588.905942720" lastFinishedPulling="2026-01-22 09:46:38.190890503 +0000 UTC m=+2589.016801048" observedRunningTime="2026-01-22 09:46:38.638034954 +0000 UTC m=+2589.463945459" watchObservedRunningTime="2026-01-22 09:46:38.644620638 +0000 UTC m=+2589.470531143" Jan 22 09:46:43 crc kubenswrapper[4681]: I0122 
09:46:43.453353 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:46:44 crc kubenswrapper[4681]: I0122 09:46:44.677220 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerStarted","Data":"37fd8c7821419c43a5ef07c93ba7768e09046abcdf54d1fa9054ef3d7656703b"} Jan 22 09:46:47 crc kubenswrapper[4681]: I0122 09:46:47.817561 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/service-telemetry-framework-operators-m5k87" Jan 22 09:46:47 crc kubenswrapper[4681]: I0122 09:46:47.818335 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/service-telemetry-framework-operators-m5k87" Jan 22 09:46:47 crc kubenswrapper[4681]: I0122 09:46:47.867470 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/service-telemetry-framework-operators-m5k87" Jan 22 09:46:48 crc kubenswrapper[4681]: I0122 09:46:48.794139 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/service-telemetry-framework-operators-m5k87" Jan 22 09:46:48 crc kubenswrapper[4681]: I0122 09:46:48.866763 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-m5k87"] Jan 22 09:46:50 crc kubenswrapper[4681]: I0122 09:46:50.727670 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-framework-operators-m5k87" podUID="0c629dda-d564-48c2-9f82-fe5d893bcf53" containerName="registry-server" containerID="cri-o://2f8c3f62d600cac1abc241ed18a2c52e8ce0f1a892d91e137a37fbb60cf8540c" gracePeriod=2 Jan 22 09:46:51 crc kubenswrapper[4681]: I0122 09:46:51.127816 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-m5k87" Jan 22 09:46:51 crc kubenswrapper[4681]: I0122 09:46:51.279058 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmqss\" (UniqueName: \"kubernetes.io/projected/0c629dda-d564-48c2-9f82-fe5d893bcf53-kube-api-access-rmqss\") pod \"0c629dda-d564-48c2-9f82-fe5d893bcf53\" (UID: \"0c629dda-d564-48c2-9f82-fe5d893bcf53\") " Jan 22 09:46:51 crc kubenswrapper[4681]: I0122 09:46:51.309191 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c629dda-d564-48c2-9f82-fe5d893bcf53-kube-api-access-rmqss" (OuterVolumeSpecName: "kube-api-access-rmqss") pod "0c629dda-d564-48c2-9f82-fe5d893bcf53" (UID: "0c629dda-d564-48c2-9f82-fe5d893bcf53"). InnerVolumeSpecName "kube-api-access-rmqss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:46:51 crc kubenswrapper[4681]: I0122 09:46:51.381088 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmqss\" (UniqueName: \"kubernetes.io/projected/0c629dda-d564-48c2-9f82-fe5d893bcf53-kube-api-access-rmqss\") on node \"crc\" DevicePath \"\"" Jan 22 09:46:51 crc kubenswrapper[4681]: I0122 09:46:51.738756 4681 generic.go:334] "Generic (PLEG): container finished" podID="0c629dda-d564-48c2-9f82-fe5d893bcf53" containerID="2f8c3f62d600cac1abc241ed18a2c52e8ce0f1a892d91e137a37fbb60cf8540c" exitCode=0 Jan 22 09:46:51 crc kubenswrapper[4681]: I0122 09:46:51.738818 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-m5k87" event={"ID":"0c629dda-d564-48c2-9f82-fe5d893bcf53","Type":"ContainerDied","Data":"2f8c3f62d600cac1abc241ed18a2c52e8ce0f1a892d91e137a37fbb60cf8540c"} Jan 22 09:46:51 crc kubenswrapper[4681]: I0122 09:46:51.738868 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-m5k87" Jan 22 09:46:51 crc kubenswrapper[4681]: I0122 09:46:51.740390 4681 scope.go:117] "RemoveContainer" containerID="2f8c3f62d600cac1abc241ed18a2c52e8ce0f1a892d91e137a37fbb60cf8540c" Jan 22 09:46:51 crc kubenswrapper[4681]: I0122 09:46:51.740226 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-m5k87" event={"ID":"0c629dda-d564-48c2-9f82-fe5d893bcf53","Type":"ContainerDied","Data":"acd3e4be4170ea587e7f3aab054874132c456eaeaaf67e971c280b1a6ac9b870"} Jan 22 09:46:51 crc kubenswrapper[4681]: I0122 09:46:51.764044 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-m5k87"] Jan 22 09:46:51 crc kubenswrapper[4681]: I0122 09:46:51.768338 4681 scope.go:117] "RemoveContainer" containerID="2f8c3f62d600cac1abc241ed18a2c52e8ce0f1a892d91e137a37fbb60cf8540c" Jan 22 09:46:51 crc kubenswrapper[4681]: E0122 09:46:51.769257 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f8c3f62d600cac1abc241ed18a2c52e8ce0f1a892d91e137a37fbb60cf8540c\": container with ID starting with 2f8c3f62d600cac1abc241ed18a2c52e8ce0f1a892d91e137a37fbb60cf8540c not found: ID does not exist" containerID="2f8c3f62d600cac1abc241ed18a2c52e8ce0f1a892d91e137a37fbb60cf8540c" Jan 22 09:46:51 crc kubenswrapper[4681]: I0122 09:46:51.769528 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f8c3f62d600cac1abc241ed18a2c52e8ce0f1a892d91e137a37fbb60cf8540c"} err="failed to get container status \"2f8c3f62d600cac1abc241ed18a2c52e8ce0f1a892d91e137a37fbb60cf8540c\": rpc error: code = NotFound desc = could not find container \"2f8c3f62d600cac1abc241ed18a2c52e8ce0f1a892d91e137a37fbb60cf8540c\": container with ID starting with 2f8c3f62d600cac1abc241ed18a2c52e8ce0f1a892d91e137a37fbb60cf8540c not found: ID does not exist" Jan 22 09:46:51 crc kubenswrapper[4681]: I0122 09:46:51.775754 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-m5k87"] Jan 22 09:46:53 crc kubenswrapper[4681]: I0122 09:46:53.471316 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c629dda-d564-48c2-9f82-fe5d893bcf53" path="/var/lib/kubelet/pods/0c629dda-d564-48c2-9f82-fe5d893bcf53/volumes" Jan 22 09:47:14 crc 
kubenswrapper[4681]: I0122 09:47:14.880793 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wbzfb"] Jan 22 09:47:14 crc kubenswrapper[4681]: E0122 09:47:14.881779 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c629dda-d564-48c2-9f82-fe5d893bcf53" containerName="registry-server" Jan 22 09:47:14 crc kubenswrapper[4681]: I0122 09:47:14.881794 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c629dda-d564-48c2-9f82-fe5d893bcf53" containerName="registry-server" Jan 22 09:47:14 crc kubenswrapper[4681]: I0122 09:47:14.881947 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c629dda-d564-48c2-9f82-fe5d893bcf53" containerName="registry-server" Jan 22 09:47:14 crc kubenswrapper[4681]: I0122 09:47:14.883055 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wbzfb" Jan 22 09:47:14 crc kubenswrapper[4681]: I0122 09:47:14.900344 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wbzfb"] Jan 22 09:47:14 crc kubenswrapper[4681]: I0122 09:47:14.951006 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ca7794-33e9-4813-b336-2e7b49642be0-utilities\") pod \"community-operators-wbzfb\" (UID: \"e3ca7794-33e9-4813-b336-2e7b49642be0\") " pod="openshift-marketplace/community-operators-wbzfb" Jan 22 09:47:14 crc kubenswrapper[4681]: I0122 09:47:14.951082 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6p9d\" (UniqueName: \"kubernetes.io/projected/e3ca7794-33e9-4813-b336-2e7b49642be0-kube-api-access-x6p9d\") pod \"community-operators-wbzfb\" (UID: \"e3ca7794-33e9-4813-b336-2e7b49642be0\") " pod="openshift-marketplace/community-operators-wbzfb" Jan 22 09:47:14 crc kubenswrapper[4681]: I0122 09:47:14.951134 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ca7794-33e9-4813-b336-2e7b49642be0-catalog-content\") pod \"community-operators-wbzfb\" (UID: \"e3ca7794-33e9-4813-b336-2e7b49642be0\") " pod="openshift-marketplace/community-operators-wbzfb" Jan 22 09:47:15 crc kubenswrapper[4681]: I0122 09:47:15.052459 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6p9d\" (UniqueName: \"kubernetes.io/projected/e3ca7794-33e9-4813-b336-2e7b49642be0-kube-api-access-x6p9d\") pod \"community-operators-wbzfb\" (UID: \"e3ca7794-33e9-4813-b336-2e7b49642be0\") " pod="openshift-marketplace/community-operators-wbzfb" Jan 22 09:47:15 crc kubenswrapper[4681]: I0122 09:47:15.052568 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ca7794-33e9-4813-b336-2e7b49642be0-catalog-content\") pod \"community-operators-wbzfb\" (UID: \"e3ca7794-33e9-4813-b336-2e7b49642be0\") " pod="openshift-marketplace/community-operators-wbzfb" Jan 22 09:47:15 crc kubenswrapper[4681]: I0122 09:47:15.052639 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ca7794-33e9-4813-b336-2e7b49642be0-utilities\") pod \"community-operators-wbzfb\" (UID: \"e3ca7794-33e9-4813-b336-2e7b49642be0\") " pod="openshift-marketplace/community-operators-wbzfb" 
Jan 22 09:47:15 crc kubenswrapper[4681]: I0122 09:47:15.053220 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ca7794-33e9-4813-b336-2e7b49642be0-utilities\") pod \"community-operators-wbzfb\" (UID: \"e3ca7794-33e9-4813-b336-2e7b49642be0\") " pod="openshift-marketplace/community-operators-wbzfb" Jan 22 09:47:15 crc kubenswrapper[4681]: I0122 09:47:15.053344 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ca7794-33e9-4813-b336-2e7b49642be0-catalog-content\") pod \"community-operators-wbzfb\" (UID: \"e3ca7794-33e9-4813-b336-2e7b49642be0\") " pod="openshift-marketplace/community-operators-wbzfb" Jan 22 09:47:15 crc kubenswrapper[4681]: I0122 09:47:15.081096 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6p9d\" (UniqueName: \"kubernetes.io/projected/e3ca7794-33e9-4813-b336-2e7b49642be0-kube-api-access-x6p9d\") pod \"community-operators-wbzfb\" (UID: \"e3ca7794-33e9-4813-b336-2e7b49642be0\") " pod="openshift-marketplace/community-operators-wbzfb" Jan 22 09:47:15 crc kubenswrapper[4681]: I0122 09:47:15.201521 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wbzfb" Jan 22 09:47:15 crc kubenswrapper[4681]: I0122 09:47:15.481247 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wbzfb"] Jan 22 09:47:15 crc kubenswrapper[4681]: I0122 09:47:15.964117 4681 generic.go:334] "Generic (PLEG): container finished" podID="e3ca7794-33e9-4813-b336-2e7b49642be0" containerID="38be4591dcdddb4540ebe7de1be72712d306d216c3484bf8388335b8ea80aace" exitCode=0 Jan 22 09:47:15 crc kubenswrapper[4681]: I0122 09:47:15.964165 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbzfb" event={"ID":"e3ca7794-33e9-4813-b336-2e7b49642be0","Type":"ContainerDied","Data":"38be4591dcdddb4540ebe7de1be72712d306d216c3484bf8388335b8ea80aace"} Jan 22 09:47:15 crc kubenswrapper[4681]: I0122 09:47:15.964192 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbzfb" event={"ID":"e3ca7794-33e9-4813-b336-2e7b49642be0","Type":"ContainerStarted","Data":"480a0ea22885cf1a160539f101e6ff67f823899a64a9e410be3779bc7b5b508e"} Jan 22 09:47:20 crc kubenswrapper[4681]: I0122 09:47:20.036826 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbzfb" event={"ID":"e3ca7794-33e9-4813-b336-2e7b49642be0","Type":"ContainerDied","Data":"66c275bc3da0562f9f1af6ee67a5519b41c5609891106563e227577aaf750b44"} Jan 22 09:47:20 crc kubenswrapper[4681]: I0122 09:47:20.037107 4681 generic.go:334] "Generic (PLEG): container finished" podID="e3ca7794-33e9-4813-b336-2e7b49642be0" containerID="66c275bc3da0562f9f1af6ee67a5519b41c5609891106563e227577aaf750b44" exitCode=0 Jan 22 09:47:22 crc kubenswrapper[4681]: I0122 09:47:22.075918 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbzfb" event={"ID":"e3ca7794-33e9-4813-b336-2e7b49642be0","Type":"ContainerStarted","Data":"6fb6bc2bcab55ccb72b0219b4ee4ca55ac2dd14ed09a500f3cca940cf3fcac26"} Jan 22 09:47:22 crc kubenswrapper[4681]: I0122 09:47:22.103870 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wbzfb" 
podStartSLOduration=2.7809844310000003 podStartE2EDuration="8.10384863s" podCreationTimestamp="2026-01-22 09:47:14 +0000 UTC" firstStartedPulling="2026-01-22 09:47:15.966036303 +0000 UTC m=+2626.791946848" lastFinishedPulling="2026-01-22 09:47:21.288900492 +0000 UTC m=+2632.114811047" observedRunningTime="2026-01-22 09:47:22.100631275 +0000 UTC m=+2632.926541780" watchObservedRunningTime="2026-01-22 09:47:22.10384863 +0000 UTC m=+2632.929759135" Jan 22 09:47:25 crc kubenswrapper[4681]: I0122 09:47:25.202175 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wbzfb" Jan 22 09:47:25 crc kubenswrapper[4681]: I0122 09:47:25.202732 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wbzfb" Jan 22 09:47:25 crc kubenswrapper[4681]: I0122 09:47:25.259099 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wbzfb" Jan 22 09:47:26 crc kubenswrapper[4681]: I0122 09:47:26.177746 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wbzfb" Jan 22 09:47:26 crc kubenswrapper[4681]: I0122 09:47:26.245191 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wbzfb"] Jan 22 09:47:28 crc kubenswrapper[4681]: I0122 09:47:28.128830 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wbzfb" podUID="e3ca7794-33e9-4813-b336-2e7b49642be0" containerName="registry-server" containerID="cri-o://6fb6bc2bcab55ccb72b0219b4ee4ca55ac2dd14ed09a500f3cca940cf3fcac26" gracePeriod=2 Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.083695 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wbzfb" Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.156913 4681 generic.go:334] "Generic (PLEG): container finished" podID="e3ca7794-33e9-4813-b336-2e7b49642be0" containerID="6fb6bc2bcab55ccb72b0219b4ee4ca55ac2dd14ed09a500f3cca940cf3fcac26" exitCode=0 Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.156975 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbzfb" event={"ID":"e3ca7794-33e9-4813-b336-2e7b49642be0","Type":"ContainerDied","Data":"6fb6bc2bcab55ccb72b0219b4ee4ca55ac2dd14ed09a500f3cca940cf3fcac26"} Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.157007 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbzfb" event={"ID":"e3ca7794-33e9-4813-b336-2e7b49642be0","Type":"ContainerDied","Data":"480a0ea22885cf1a160539f101e6ff67f823899a64a9e410be3779bc7b5b508e"} Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.157029 4681 scope.go:117] "RemoveContainer" containerID="6fb6bc2bcab55ccb72b0219b4ee4ca55ac2dd14ed09a500f3cca940cf3fcac26" Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.157210 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wbzfb" Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.175229 4681 scope.go:117] "RemoveContainer" containerID="66c275bc3da0562f9f1af6ee67a5519b41c5609891106563e227577aaf750b44" Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.197998 4681 scope.go:117] "RemoveContainer" containerID="38be4591dcdddb4540ebe7de1be72712d306d216c3484bf8388335b8ea80aace" Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.235545 4681 scope.go:117] "RemoveContainer" containerID="6fb6bc2bcab55ccb72b0219b4ee4ca55ac2dd14ed09a500f3cca940cf3fcac26" Jan 22 09:47:29 crc kubenswrapper[4681]: E0122 09:47:29.236161 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fb6bc2bcab55ccb72b0219b4ee4ca55ac2dd14ed09a500f3cca940cf3fcac26\": container with ID starting with 6fb6bc2bcab55ccb72b0219b4ee4ca55ac2dd14ed09a500f3cca940cf3fcac26 not found: ID does not exist" containerID="6fb6bc2bcab55ccb72b0219b4ee4ca55ac2dd14ed09a500f3cca940cf3fcac26" Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.236217 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fb6bc2bcab55ccb72b0219b4ee4ca55ac2dd14ed09a500f3cca940cf3fcac26"} err="failed to get container status \"6fb6bc2bcab55ccb72b0219b4ee4ca55ac2dd14ed09a500f3cca940cf3fcac26\": rpc error: code = NotFound desc = could not find container \"6fb6bc2bcab55ccb72b0219b4ee4ca55ac2dd14ed09a500f3cca940cf3fcac26\": container with ID starting with 6fb6bc2bcab55ccb72b0219b4ee4ca55ac2dd14ed09a500f3cca940cf3fcac26 not found: ID does not exist" Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.236254 4681 scope.go:117] "RemoveContainer" containerID="66c275bc3da0562f9f1af6ee67a5519b41c5609891106563e227577aaf750b44" Jan 22 09:47:29 crc kubenswrapper[4681]: E0122 09:47:29.236744 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66c275bc3da0562f9f1af6ee67a5519b41c5609891106563e227577aaf750b44\": container with ID starting with 66c275bc3da0562f9f1af6ee67a5519b41c5609891106563e227577aaf750b44 not found: ID does not exist" containerID="66c275bc3da0562f9f1af6ee67a5519b41c5609891106563e227577aaf750b44" Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.236802 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66c275bc3da0562f9f1af6ee67a5519b41c5609891106563e227577aaf750b44"} err="failed to get container status \"66c275bc3da0562f9f1af6ee67a5519b41c5609891106563e227577aaf750b44\": rpc error: code = NotFound desc = could not find container \"66c275bc3da0562f9f1af6ee67a5519b41c5609891106563e227577aaf750b44\": container with ID starting with 66c275bc3da0562f9f1af6ee67a5519b41c5609891106563e227577aaf750b44 not found: ID does not exist" Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.236834 4681 scope.go:117] "RemoveContainer" containerID="38be4591dcdddb4540ebe7de1be72712d306d216c3484bf8388335b8ea80aace" Jan 22 09:47:29 crc kubenswrapper[4681]: E0122 09:47:29.237492 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38be4591dcdddb4540ebe7de1be72712d306d216c3484bf8388335b8ea80aace\": container with ID starting with 38be4591dcdddb4540ebe7de1be72712d306d216c3484bf8388335b8ea80aace not found: ID does not exist" containerID="38be4591dcdddb4540ebe7de1be72712d306d216c3484bf8388335b8ea80aace" 
Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.237529 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38be4591dcdddb4540ebe7de1be72712d306d216c3484bf8388335b8ea80aace"} err="failed to get container status \"38be4591dcdddb4540ebe7de1be72712d306d216c3484bf8388335b8ea80aace\": rpc error: code = NotFound desc = could not find container \"38be4591dcdddb4540ebe7de1be72712d306d216c3484bf8388335b8ea80aace\": container with ID starting with 38be4591dcdddb4540ebe7de1be72712d306d216c3484bf8388335b8ea80aace not found: ID does not exist" Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.282357 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ca7794-33e9-4813-b336-2e7b49642be0-catalog-content\") pod \"e3ca7794-33e9-4813-b336-2e7b49642be0\" (UID: \"e3ca7794-33e9-4813-b336-2e7b49642be0\") " Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.282545 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ca7794-33e9-4813-b336-2e7b49642be0-utilities\") pod \"e3ca7794-33e9-4813-b336-2e7b49642be0\" (UID: \"e3ca7794-33e9-4813-b336-2e7b49642be0\") " Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.282585 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6p9d\" (UniqueName: \"kubernetes.io/projected/e3ca7794-33e9-4813-b336-2e7b49642be0-kube-api-access-x6p9d\") pod \"e3ca7794-33e9-4813-b336-2e7b49642be0\" (UID: \"e3ca7794-33e9-4813-b336-2e7b49642be0\") " Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.284411 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3ca7794-33e9-4813-b336-2e7b49642be0-utilities" (OuterVolumeSpecName: "utilities") pod "e3ca7794-33e9-4813-b336-2e7b49642be0" (UID: "e3ca7794-33e9-4813-b336-2e7b49642be0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.293138 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3ca7794-33e9-4813-b336-2e7b49642be0-kube-api-access-x6p9d" (OuterVolumeSpecName: "kube-api-access-x6p9d") pod "e3ca7794-33e9-4813-b336-2e7b49642be0" (UID: "e3ca7794-33e9-4813-b336-2e7b49642be0"). InnerVolumeSpecName "kube-api-access-x6p9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.371962 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3ca7794-33e9-4813-b336-2e7b49642be0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3ca7794-33e9-4813-b336-2e7b49642be0" (UID: "e3ca7794-33e9-4813-b336-2e7b49642be0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.384608 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3ca7794-33e9-4813-b336-2e7b49642be0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.384666 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3ca7794-33e9-4813-b336-2e7b49642be0-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.384688 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6p9d\" (UniqueName: \"kubernetes.io/projected/e3ca7794-33e9-4813-b336-2e7b49642be0-kube-api-access-x6p9d\") on node \"crc\" DevicePath \"\"" Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.518734 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wbzfb"] Jan 22 09:47:29 crc kubenswrapper[4681]: I0122 09:47:29.528282 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wbzfb"] Jan 22 09:47:31 crc kubenswrapper[4681]: I0122 09:47:31.470056 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3ca7794-33e9-4813-b336-2e7b49642be0" path="/var/lib/kubelet/pods/e3ca7794-33e9-4813-b336-2e7b49642be0/volumes" Jan 22 09:47:34 crc kubenswrapper[4681]: E0122 09:47:34.119859 4681 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ca7794_33e9_4813_b336_2e7b49642be0.slice/crio-480a0ea22885cf1a160539f101e6ff67f823899a64a9e410be3779bc7b5b508e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ca7794_33e9_4813_b336_2e7b49642be0.slice\": RecentStats: unable to find data in memory cache]" Jan 22 09:47:44 crc kubenswrapper[4681]: E0122 09:47:44.293453 4681 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ca7794_33e9_4813_b336_2e7b49642be0.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ca7794_33e9_4813_b336_2e7b49642be0.slice/crio-480a0ea22885cf1a160539f101e6ff67f823899a64a9e410be3779bc7b5b508e\": RecentStats: unable to find data in memory cache]" Jan 22 09:47:54 crc kubenswrapper[4681]: E0122 09:47:54.482587 4681 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ca7794_33e9_4813_b336_2e7b49642be0.slice/crio-480a0ea22885cf1a160539f101e6ff67f823899a64a9e410be3779bc7b5b508e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ca7794_33e9_4813_b336_2e7b49642be0.slice\": RecentStats: unable to find data in memory cache]" Jan 22 09:48:04 crc kubenswrapper[4681]: E0122 09:48:04.621207 4681 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ca7794_33e9_4813_b336_2e7b49642be0.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ca7794_33e9_4813_b336_2e7b49642be0.slice/crio-480a0ea22885cf1a160539f101e6ff67f823899a64a9e410be3779bc7b5b508e\": RecentStats: unable to find data in memory cache]" Jan 22 09:48:12 crc kubenswrapper[4681]: I0122 09:48:12.760952 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p6gjl"] Jan 22 09:48:12 crc kubenswrapper[4681]: E0122 09:48:12.762253 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ca7794-33e9-4813-b336-2e7b49642be0" containerName="registry-server" Jan 22 09:48:12 crc kubenswrapper[4681]: I0122 09:48:12.762305 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ca7794-33e9-4813-b336-2e7b49642be0" containerName="registry-server" Jan 22 09:48:12 crc kubenswrapper[4681]: E0122 09:48:12.762341 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ca7794-33e9-4813-b336-2e7b49642be0" containerName="extract-utilities" Jan 22 09:48:12 crc kubenswrapper[4681]: I0122 09:48:12.762351 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ca7794-33e9-4813-b336-2e7b49642be0" containerName="extract-utilities" Jan 22 09:48:12 crc kubenswrapper[4681]: E0122 09:48:12.762379 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ca7794-33e9-4813-b336-2e7b49642be0" containerName="extract-content" Jan 22 09:48:12 crc kubenswrapper[4681]: I0122 09:48:12.762387 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ca7794-33e9-4813-b336-2e7b49642be0" containerName="extract-content" Jan 22 09:48:12 crc kubenswrapper[4681]: I0122 09:48:12.762591 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ca7794-33e9-4813-b336-2e7b49642be0" containerName="registry-server" Jan 22 09:48:12 crc kubenswrapper[4681]: I0122 09:48:12.763795 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p6gjl" Jan 22 09:48:12 crc kubenswrapper[4681]: I0122 09:48:12.788795 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6gjl"] Jan 22 09:48:12 crc kubenswrapper[4681]: I0122 09:48:12.943982 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/599005da-d668-43d5-bd90-c4a52ea23ec9-catalog-content\") pod \"redhat-operators-p6gjl\" (UID: \"599005da-d668-43d5-bd90-c4a52ea23ec9\") " pod="openshift-marketplace/redhat-operators-p6gjl" Jan 22 09:48:12 crc kubenswrapper[4681]: I0122 09:48:12.944074 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqfbh\" (UniqueName: \"kubernetes.io/projected/599005da-d668-43d5-bd90-c4a52ea23ec9-kube-api-access-mqfbh\") pod \"redhat-operators-p6gjl\" (UID: \"599005da-d668-43d5-bd90-c4a52ea23ec9\") " pod="openshift-marketplace/redhat-operators-p6gjl" Jan 22 09:48:12 crc kubenswrapper[4681]: I0122 09:48:12.944114 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/599005da-d668-43d5-bd90-c4a52ea23ec9-utilities\") pod \"redhat-operators-p6gjl\" (UID: \"599005da-d668-43d5-bd90-c4a52ea23ec9\") " pod="openshift-marketplace/redhat-operators-p6gjl" Jan 22 09:48:13 crc kubenswrapper[4681]: I0122 09:48:13.046073 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/599005da-d668-43d5-bd90-c4a52ea23ec9-utilities\") pod \"redhat-operators-p6gjl\" (UID: \"599005da-d668-43d5-bd90-c4a52ea23ec9\") " pod="openshift-marketplace/redhat-operators-p6gjl" Jan 22 09:48:13 crc kubenswrapper[4681]: I0122 09:48:13.046200 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/599005da-d668-43d5-bd90-c4a52ea23ec9-catalog-content\") pod \"redhat-operators-p6gjl\" (UID: \"599005da-d668-43d5-bd90-c4a52ea23ec9\") " pod="openshift-marketplace/redhat-operators-p6gjl" Jan 22 09:48:13 crc kubenswrapper[4681]: I0122 09:48:13.046336 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqfbh\" (UniqueName: \"kubernetes.io/projected/599005da-d668-43d5-bd90-c4a52ea23ec9-kube-api-access-mqfbh\") pod \"redhat-operators-p6gjl\" (UID: \"599005da-d668-43d5-bd90-c4a52ea23ec9\") " pod="openshift-marketplace/redhat-operators-p6gjl" Jan 22 09:48:13 crc kubenswrapper[4681]: I0122 09:48:13.046852 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/599005da-d668-43d5-bd90-c4a52ea23ec9-catalog-content\") pod \"redhat-operators-p6gjl\" (UID: \"599005da-d668-43d5-bd90-c4a52ea23ec9\") " pod="openshift-marketplace/redhat-operators-p6gjl" Jan 22 09:48:13 crc kubenswrapper[4681]: I0122 09:48:13.046976 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/599005da-d668-43d5-bd90-c4a52ea23ec9-utilities\") pod \"redhat-operators-p6gjl\" (UID: \"599005da-d668-43d5-bd90-c4a52ea23ec9\") " pod="openshift-marketplace/redhat-operators-p6gjl" Jan 22 09:48:13 crc kubenswrapper[4681]: I0122 09:48:13.070533 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mqfbh\" (UniqueName: \"kubernetes.io/projected/599005da-d668-43d5-bd90-c4a52ea23ec9-kube-api-access-mqfbh\") pod \"redhat-operators-p6gjl\" (UID: \"599005da-d668-43d5-bd90-c4a52ea23ec9\") " pod="openshift-marketplace/redhat-operators-p6gjl" Jan 22 09:48:13 crc kubenswrapper[4681]: I0122 09:48:13.104599 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6gjl" Jan 22 09:48:13 crc kubenswrapper[4681]: I0122 09:48:13.517681 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6gjl"] Jan 22 09:48:13 crc kubenswrapper[4681]: I0122 09:48:13.535624 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6gjl" event={"ID":"599005da-d668-43d5-bd90-c4a52ea23ec9","Type":"ContainerStarted","Data":"688304fa748e9cbb80c71b26c3ba4a918e5c205d2141ae67577522355202a4e0"} Jan 22 09:48:14 crc kubenswrapper[4681]: I0122 09:48:14.545556 4681 generic.go:334] "Generic (PLEG): container finished" podID="599005da-d668-43d5-bd90-c4a52ea23ec9" containerID="0e56d0f6c3bc846211baa125b105cb96035d0d90ff72aa8c5c51c54bec794937" exitCode=0 Jan 22 09:48:14 crc kubenswrapper[4681]: I0122 09:48:14.545844 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6gjl" event={"ID":"599005da-d668-43d5-bd90-c4a52ea23ec9","Type":"ContainerDied","Data":"0e56d0f6c3bc846211baa125b105cb96035d0d90ff72aa8c5c51c54bec794937"} Jan 22 09:48:14 crc kubenswrapper[4681]: E0122 09:48:14.878434 4681 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ca7794_33e9_4813_b336_2e7b49642be0.slice/crio-480a0ea22885cf1a160539f101e6ff67f823899a64a9e410be3779bc7b5b508e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ca7794_33e9_4813_b336_2e7b49642be0.slice\": RecentStats: unable to find data in memory cache]" Jan 22 09:48:15 crc kubenswrapper[4681]: I0122 09:48:15.554403 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6gjl" event={"ID":"599005da-d668-43d5-bd90-c4a52ea23ec9","Type":"ContainerStarted","Data":"0564cc65047cb6afbd951e76c55664b5d3f3b4bc4e07543293b5a7eb17cac600"} Jan 22 09:48:17 crc kubenswrapper[4681]: I0122 09:48:17.571680 4681 generic.go:334] "Generic (PLEG): container finished" podID="599005da-d668-43d5-bd90-c4a52ea23ec9" containerID="0564cc65047cb6afbd951e76c55664b5d3f3b4bc4e07543293b5a7eb17cac600" exitCode=0 Jan 22 09:48:17 crc kubenswrapper[4681]: I0122 09:48:17.571804 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6gjl" event={"ID":"599005da-d668-43d5-bd90-c4a52ea23ec9","Type":"ContainerDied","Data":"0564cc65047cb6afbd951e76c55664b5d3f3b4bc4e07543293b5a7eb17cac600"} Jan 22 09:48:21 crc kubenswrapper[4681]: I0122 09:48:21.701666 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6gjl" event={"ID":"599005da-d668-43d5-bd90-c4a52ea23ec9","Type":"ContainerStarted","Data":"1fb15bd6b02df5a7bef8db3d1d92c9a10b5b81c713356f3d0eeaf3c30c213b08"} Jan 22 09:48:21 crc kubenswrapper[4681]: I0122 09:48:21.726930 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p6gjl" podStartSLOduration=3.041093712 podStartE2EDuration="9.726910159s" 
podCreationTimestamp="2026-01-22 09:48:12 +0000 UTC" firstStartedPulling="2026-01-22 09:48:14.548956622 +0000 UTC m=+2685.374867167" lastFinishedPulling="2026-01-22 09:48:21.234773109 +0000 UTC m=+2692.060683614" observedRunningTime="2026-01-22 09:48:21.72494965 +0000 UTC m=+2692.550860175" watchObservedRunningTime="2026-01-22 09:48:21.726910159 +0000 UTC m=+2692.552820664" Jan 22 09:48:23 crc kubenswrapper[4681]: I0122 09:48:23.105612 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p6gjl" Jan 22 09:48:23 crc kubenswrapper[4681]: I0122 09:48:23.105690 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p6gjl" Jan 22 09:48:24 crc kubenswrapper[4681]: I0122 09:48:24.166646 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p6gjl" podUID="599005da-d668-43d5-bd90-c4a52ea23ec9" containerName="registry-server" probeResult="failure" output=< Jan 22 09:48:24 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Jan 22 09:48:24 crc kubenswrapper[4681]: > Jan 22 09:48:25 crc kubenswrapper[4681]: E0122 09:48:25.044569 4681 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ca7794_33e9_4813_b336_2e7b49642be0.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ca7794_33e9_4813_b336_2e7b49642be0.slice/crio-480a0ea22885cf1a160539f101e6ff67f823899a64a9e410be3779bc7b5b508e\": RecentStats: unable to find data in memory cache]" Jan 22 09:48:33 crc kubenswrapper[4681]: I0122 09:48:33.171448 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p6gjl" Jan 22 09:48:33 crc kubenswrapper[4681]: I0122 09:48:33.219405 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p6gjl" Jan 22 09:48:33 crc kubenswrapper[4681]: I0122 09:48:33.422455 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p6gjl"] Jan 22 09:48:34 crc kubenswrapper[4681]: I0122 09:48:34.802056 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p6gjl" podUID="599005da-d668-43d5-bd90-c4a52ea23ec9" containerName="registry-server" containerID="cri-o://1fb15bd6b02df5a7bef8db3d1d92c9a10b5b81c713356f3d0eeaf3c30c213b08" gracePeriod=2 Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.190043 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p6gjl" Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.301766 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/599005da-d668-43d5-bd90-c4a52ea23ec9-catalog-content\") pod \"599005da-d668-43d5-bd90-c4a52ea23ec9\" (UID: \"599005da-d668-43d5-bd90-c4a52ea23ec9\") " Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.302180 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqfbh\" (UniqueName: \"kubernetes.io/projected/599005da-d668-43d5-bd90-c4a52ea23ec9-kube-api-access-mqfbh\") pod \"599005da-d668-43d5-bd90-c4a52ea23ec9\" (UID: \"599005da-d668-43d5-bd90-c4a52ea23ec9\") " Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.302406 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/599005da-d668-43d5-bd90-c4a52ea23ec9-utilities\") pod \"599005da-d668-43d5-bd90-c4a52ea23ec9\" (UID: \"599005da-d668-43d5-bd90-c4a52ea23ec9\") " Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.303152 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/599005da-d668-43d5-bd90-c4a52ea23ec9-utilities" (OuterVolumeSpecName: "utilities") pod "599005da-d668-43d5-bd90-c4a52ea23ec9" (UID: "599005da-d668-43d5-bd90-c4a52ea23ec9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.310041 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/599005da-d668-43d5-bd90-c4a52ea23ec9-kube-api-access-mqfbh" (OuterVolumeSpecName: "kube-api-access-mqfbh") pod "599005da-d668-43d5-bd90-c4a52ea23ec9" (UID: "599005da-d668-43d5-bd90-c4a52ea23ec9"). InnerVolumeSpecName "kube-api-access-mqfbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.404817 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqfbh\" (UniqueName: \"kubernetes.io/projected/599005da-d668-43d5-bd90-c4a52ea23ec9-kube-api-access-mqfbh\") on node \"crc\" DevicePath \"\"" Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.405088 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/599005da-d668-43d5-bd90-c4a52ea23ec9-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.441553 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/599005da-d668-43d5-bd90-c4a52ea23ec9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "599005da-d668-43d5-bd90-c4a52ea23ec9" (UID: "599005da-d668-43d5-bd90-c4a52ea23ec9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.506645 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/599005da-d668-43d5-bd90-c4a52ea23ec9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.814816 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p6gjl" Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.815036 4681 generic.go:334] "Generic (PLEG): container finished" podID="599005da-d668-43d5-bd90-c4a52ea23ec9" containerID="1fb15bd6b02df5a7bef8db3d1d92c9a10b5b81c713356f3d0eeaf3c30c213b08" exitCode=0 Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.815100 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6gjl" event={"ID":"599005da-d668-43d5-bd90-c4a52ea23ec9","Type":"ContainerDied","Data":"1fb15bd6b02df5a7bef8db3d1d92c9a10b5b81c713356f3d0eeaf3c30c213b08"} Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.815148 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6gjl" event={"ID":"599005da-d668-43d5-bd90-c4a52ea23ec9","Type":"ContainerDied","Data":"688304fa748e9cbb80c71b26c3ba4a918e5c205d2141ae67577522355202a4e0"} Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.816603 4681 scope.go:117] "RemoveContainer" containerID="1fb15bd6b02df5a7bef8db3d1d92c9a10b5b81c713356f3d0eeaf3c30c213b08" Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.853620 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p6gjl"] Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.859083 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p6gjl"] Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.860479 4681 scope.go:117] "RemoveContainer" containerID="0564cc65047cb6afbd951e76c55664b5d3f3b4bc4e07543293b5a7eb17cac600" Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.880357 4681 scope.go:117] "RemoveContainer" containerID="0e56d0f6c3bc846211baa125b105cb96035d0d90ff72aa8c5c51c54bec794937" Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.903533 4681 scope.go:117] "RemoveContainer" containerID="1fb15bd6b02df5a7bef8db3d1d92c9a10b5b81c713356f3d0eeaf3c30c213b08" Jan 22 09:48:35 crc kubenswrapper[4681]: E0122 09:48:35.903953 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fb15bd6b02df5a7bef8db3d1d92c9a10b5b81c713356f3d0eeaf3c30c213b08\": container with ID starting with 1fb15bd6b02df5a7bef8db3d1d92c9a10b5b81c713356f3d0eeaf3c30c213b08 not found: ID does not exist" containerID="1fb15bd6b02df5a7bef8db3d1d92c9a10b5b81c713356f3d0eeaf3c30c213b08" Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.904025 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fb15bd6b02df5a7bef8db3d1d92c9a10b5b81c713356f3d0eeaf3c30c213b08"} err="failed to get container status \"1fb15bd6b02df5a7bef8db3d1d92c9a10b5b81c713356f3d0eeaf3c30c213b08\": rpc error: code = NotFound desc = could not find container \"1fb15bd6b02df5a7bef8db3d1d92c9a10b5b81c713356f3d0eeaf3c30c213b08\": container with ID starting with 1fb15bd6b02df5a7bef8db3d1d92c9a10b5b81c713356f3d0eeaf3c30c213b08 not found: ID does not exist" Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.904049 4681 scope.go:117] "RemoveContainer" containerID="0564cc65047cb6afbd951e76c55664b5d3f3b4bc4e07543293b5a7eb17cac600" Jan 22 09:48:35 crc kubenswrapper[4681]: E0122 09:48:35.904518 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0564cc65047cb6afbd951e76c55664b5d3f3b4bc4e07543293b5a7eb17cac600\": container with ID starting with 
0564cc65047cb6afbd951e76c55664b5d3f3b4bc4e07543293b5a7eb17cac600 not found: ID does not exist" containerID="0564cc65047cb6afbd951e76c55664b5d3f3b4bc4e07543293b5a7eb17cac600" Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.904539 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0564cc65047cb6afbd951e76c55664b5d3f3b4bc4e07543293b5a7eb17cac600"} err="failed to get container status \"0564cc65047cb6afbd951e76c55664b5d3f3b4bc4e07543293b5a7eb17cac600\": rpc error: code = NotFound desc = could not find container \"0564cc65047cb6afbd951e76c55664b5d3f3b4bc4e07543293b5a7eb17cac600\": container with ID starting with 0564cc65047cb6afbd951e76c55664b5d3f3b4bc4e07543293b5a7eb17cac600 not found: ID does not exist" Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.904553 4681 scope.go:117] "RemoveContainer" containerID="0e56d0f6c3bc846211baa125b105cb96035d0d90ff72aa8c5c51c54bec794937" Jan 22 09:48:35 crc kubenswrapper[4681]: E0122 09:48:35.904983 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e56d0f6c3bc846211baa125b105cb96035d0d90ff72aa8c5c51c54bec794937\": container with ID starting with 0e56d0f6c3bc846211baa125b105cb96035d0d90ff72aa8c5c51c54bec794937 not found: ID does not exist" containerID="0e56d0f6c3bc846211baa125b105cb96035d0d90ff72aa8c5c51c54bec794937" Jan 22 09:48:35 crc kubenswrapper[4681]: I0122 09:48:35.905005 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e56d0f6c3bc846211baa125b105cb96035d0d90ff72aa8c5c51c54bec794937"} err="failed to get container status \"0e56d0f6c3bc846211baa125b105cb96035d0d90ff72aa8c5c51c54bec794937\": rpc error: code = NotFound desc = could not find container \"0e56d0f6c3bc846211baa125b105cb96035d0d90ff72aa8c5c51c54bec794937\": container with ID starting with 0e56d0f6c3bc846211baa125b105cb96035d0d90ff72aa8c5c51c54bec794937 not found: ID does not exist" Jan 22 09:48:37 crc kubenswrapper[4681]: I0122 09:48:37.468234 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="599005da-d668-43d5-bd90-c4a52ea23ec9" path="/var/lib/kubelet/pods/599005da-d668-43d5-bd90-c4a52ea23ec9/volumes" Jan 22 09:48:56 crc kubenswrapper[4681]: I0122 09:48:56.031655 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:48:56 crc kubenswrapper[4681]: I0122 09:48:56.033893 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:49:26 crc kubenswrapper[4681]: I0122 09:49:26.031576 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:49:26 crc kubenswrapper[4681]: I0122 09:49:26.032088 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" 
podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:49:56 crc kubenswrapper[4681]: I0122 09:49:56.031603 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:49:56 crc kubenswrapper[4681]: I0122 09:49:56.032188 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:49:56 crc kubenswrapper[4681]: I0122 09:49:56.032245 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:49:56 crc kubenswrapper[4681]: I0122 09:49:56.032981 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37fd8c7821419c43a5ef07c93ba7768e09046abcdf54d1fa9054ef3d7656703b"} pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 09:49:56 crc kubenswrapper[4681]: I0122 09:49:56.033061 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" containerID="cri-o://37fd8c7821419c43a5ef07c93ba7768e09046abcdf54d1fa9054ef3d7656703b" gracePeriod=600 Jan 22 09:49:56 crc kubenswrapper[4681]: I0122 09:49:56.487363 4681 generic.go:334] "Generic (PLEG): container finished" podID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerID="37fd8c7821419c43a5ef07c93ba7768e09046abcdf54d1fa9054ef3d7656703b" exitCode=0 Jan 22 09:49:56 crc kubenswrapper[4681]: I0122 09:49:56.487832 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerDied","Data":"37fd8c7821419c43a5ef07c93ba7768e09046abcdf54d1fa9054ef3d7656703b"} Jan 22 09:49:56 crc kubenswrapper[4681]: I0122 09:49:56.487901 4681 scope.go:117] "RemoveContainer" containerID="05aca758ed8176914669ecfb9a78b0dfa0cdffb0b0f7c7201ddca64397b9eecd" Jan 22 09:49:57 crc kubenswrapper[4681]: I0122 09:49:57.524529 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerStarted","Data":"3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0"} Jan 22 09:51:54 crc kubenswrapper[4681]: I0122 09:51:54.323889 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-operators-fnfxq"] Jan 22 09:51:54 crc kubenswrapper[4681]: E0122 09:51:54.325005 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="599005da-d668-43d5-bd90-c4a52ea23ec9" containerName="registry-server" Jan 22 09:51:54 crc kubenswrapper[4681]: I0122 09:51:54.325019 4681 
state_mem.go:107] "Deleted CPUSet assignment" podUID="599005da-d668-43d5-bd90-c4a52ea23ec9" containerName="registry-server" Jan 22 09:51:54 crc kubenswrapper[4681]: E0122 09:51:54.325033 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="599005da-d668-43d5-bd90-c4a52ea23ec9" containerName="extract-content" Jan 22 09:51:54 crc kubenswrapper[4681]: I0122 09:51:54.325039 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="599005da-d668-43d5-bd90-c4a52ea23ec9" containerName="extract-content" Jan 22 09:51:54 crc kubenswrapper[4681]: E0122 09:51:54.325054 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="599005da-d668-43d5-bd90-c4a52ea23ec9" containerName="extract-utilities" Jan 22 09:51:54 crc kubenswrapper[4681]: I0122 09:51:54.325061 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="599005da-d668-43d5-bd90-c4a52ea23ec9" containerName="extract-utilities" Jan 22 09:51:54 crc kubenswrapper[4681]: I0122 09:51:54.325177 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="599005da-d668-43d5-bd90-c4a52ea23ec9" containerName="registry-server" Jan 22 09:51:54 crc kubenswrapper[4681]: I0122 09:51:54.325663 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-fnfxq" Jan 22 09:51:54 crc kubenswrapper[4681]: I0122 09:51:54.340935 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-fnfxq"] Jan 22 09:51:54 crc kubenswrapper[4681]: I0122 09:51:54.441038 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkq6l\" (UniqueName: \"kubernetes.io/projected/a5abaeb7-9ea4-46ec-afbb-e30c277e59a8-kube-api-access-kkq6l\") pod \"service-telemetry-framework-operators-fnfxq\" (UID: \"a5abaeb7-9ea4-46ec-afbb-e30c277e59a8\") " pod="service-telemetry/service-telemetry-framework-operators-fnfxq" Jan 22 09:51:54 crc kubenswrapper[4681]: I0122 09:51:54.542325 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkq6l\" (UniqueName: \"kubernetes.io/projected/a5abaeb7-9ea4-46ec-afbb-e30c277e59a8-kube-api-access-kkq6l\") pod \"service-telemetry-framework-operators-fnfxq\" (UID: \"a5abaeb7-9ea4-46ec-afbb-e30c277e59a8\") " pod="service-telemetry/service-telemetry-framework-operators-fnfxq" Jan 22 09:51:54 crc kubenswrapper[4681]: I0122 09:51:54.586609 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkq6l\" (UniqueName: \"kubernetes.io/projected/a5abaeb7-9ea4-46ec-afbb-e30c277e59a8-kube-api-access-kkq6l\") pod \"service-telemetry-framework-operators-fnfxq\" (UID: \"a5abaeb7-9ea4-46ec-afbb-e30c277e59a8\") " pod="service-telemetry/service-telemetry-framework-operators-fnfxq" Jan 22 09:51:54 crc kubenswrapper[4681]: I0122 09:51:54.654352 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-fnfxq" Jan 22 09:51:54 crc kubenswrapper[4681]: I0122 09:51:54.897745 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-fnfxq"] Jan 22 09:51:54 crc kubenswrapper[4681]: I0122 09:51:54.903675 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 09:51:55 crc kubenswrapper[4681]: I0122 09:51:55.576247 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-fnfxq" event={"ID":"a5abaeb7-9ea4-46ec-afbb-e30c277e59a8","Type":"ContainerStarted","Data":"79764efb7617b5f4fb93280b8b2ada81d50229c2cf1354b7e3a0a4bf23515c64"} Jan 22 09:51:55 crc kubenswrapper[4681]: I0122 09:51:55.576627 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-fnfxq" event={"ID":"a5abaeb7-9ea4-46ec-afbb-e30c277e59a8","Type":"ContainerStarted","Data":"a043f1cc8d79819ff3ee6bfe01b65fbfaa193c96563f6bf4c1ed153c2eb14425"} Jan 22 09:51:55 crc kubenswrapper[4681]: I0122 09:51:55.607283 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-operators-fnfxq" podStartSLOduration=1.487687182 podStartE2EDuration="1.607227543s" podCreationTimestamp="2026-01-22 09:51:54 +0000 UTC" firstStartedPulling="2026-01-22 09:51:54.903389199 +0000 UTC m=+2905.729299724" lastFinishedPulling="2026-01-22 09:51:55.02292954 +0000 UTC m=+2905.848840085" observedRunningTime="2026-01-22 09:51:55.594009534 +0000 UTC m=+2906.419920039" watchObservedRunningTime="2026-01-22 09:51:55.607227543 +0000 UTC m=+2906.433138048" Jan 22 09:51:56 crc kubenswrapper[4681]: I0122 09:51:56.031677 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:51:56 crc kubenswrapper[4681]: I0122 09:51:56.031752 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:52:04 crc kubenswrapper[4681]: I0122 09:52:04.663914 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/service-telemetry-framework-operators-fnfxq" Jan 22 09:52:04 crc kubenswrapper[4681]: I0122 09:52:04.664576 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/service-telemetry-framework-operators-fnfxq" Jan 22 09:52:04 crc kubenswrapper[4681]: I0122 09:52:04.707827 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/service-telemetry-framework-operators-fnfxq" Jan 22 09:52:05 crc kubenswrapper[4681]: I0122 09:52:05.688474 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/service-telemetry-framework-operators-fnfxq" Jan 22 09:52:05 crc kubenswrapper[4681]: I0122 09:52:05.738208 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-fnfxq"] Jan 22 09:52:07 crc 
kubenswrapper[4681]: I0122 09:52:07.677722 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-framework-operators-fnfxq" podUID="a5abaeb7-9ea4-46ec-afbb-e30c277e59a8" containerName="registry-server" containerID="cri-o://79764efb7617b5f4fb93280b8b2ada81d50229c2cf1354b7e3a0a4bf23515c64" gracePeriod=2 Jan 22 09:52:08 crc kubenswrapper[4681]: I0122 09:52:08.595582 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-fnfxq" Jan 22 09:52:08 crc kubenswrapper[4681]: I0122 09:52:08.687064 4681 generic.go:334] "Generic (PLEG): container finished" podID="a5abaeb7-9ea4-46ec-afbb-e30c277e59a8" containerID="79764efb7617b5f4fb93280b8b2ada81d50229c2cf1354b7e3a0a4bf23515c64" exitCode=0 Jan 22 09:52:08 crc kubenswrapper[4681]: I0122 09:52:08.687124 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-fnfxq" Jan 22 09:52:08 crc kubenswrapper[4681]: I0122 09:52:08.687119 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-fnfxq" event={"ID":"a5abaeb7-9ea4-46ec-afbb-e30c277e59a8","Type":"ContainerDied","Data":"79764efb7617b5f4fb93280b8b2ada81d50229c2cf1354b7e3a0a4bf23515c64"} Jan 22 09:52:08 crc kubenswrapper[4681]: I0122 09:52:08.687185 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-fnfxq" event={"ID":"a5abaeb7-9ea4-46ec-afbb-e30c277e59a8","Type":"ContainerDied","Data":"a043f1cc8d79819ff3ee6bfe01b65fbfaa193c96563f6bf4c1ed153c2eb14425"} Jan 22 09:52:08 crc kubenswrapper[4681]: I0122 09:52:08.687218 4681 scope.go:117] "RemoveContainer" containerID="79764efb7617b5f4fb93280b8b2ada81d50229c2cf1354b7e3a0a4bf23515c64" Jan 22 09:52:08 crc kubenswrapper[4681]: I0122 09:52:08.704914 4681 scope.go:117] "RemoveContainer" containerID="79764efb7617b5f4fb93280b8b2ada81d50229c2cf1354b7e3a0a4bf23515c64" Jan 22 09:52:08 crc kubenswrapper[4681]: E0122 09:52:08.705654 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79764efb7617b5f4fb93280b8b2ada81d50229c2cf1354b7e3a0a4bf23515c64\": container with ID starting with 79764efb7617b5f4fb93280b8b2ada81d50229c2cf1354b7e3a0a4bf23515c64 not found: ID does not exist" containerID="79764efb7617b5f4fb93280b8b2ada81d50229c2cf1354b7e3a0a4bf23515c64" Jan 22 09:52:08 crc kubenswrapper[4681]: I0122 09:52:08.705696 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79764efb7617b5f4fb93280b8b2ada81d50229c2cf1354b7e3a0a4bf23515c64"} err="failed to get container status \"79764efb7617b5f4fb93280b8b2ada81d50229c2cf1354b7e3a0a4bf23515c64\": rpc error: code = NotFound desc = could not find container \"79764efb7617b5f4fb93280b8b2ada81d50229c2cf1354b7e3a0a4bf23515c64\": container with ID starting with 79764efb7617b5f4fb93280b8b2ada81d50229c2cf1354b7e3a0a4bf23515c64 not found: ID does not exist" Jan 22 09:52:08 crc kubenswrapper[4681]: I0122 09:52:08.779641 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkq6l\" (UniqueName: \"kubernetes.io/projected/a5abaeb7-9ea4-46ec-afbb-e30c277e59a8-kube-api-access-kkq6l\") pod \"a5abaeb7-9ea4-46ec-afbb-e30c277e59a8\" (UID: \"a5abaeb7-9ea4-46ec-afbb-e30c277e59a8\") " Jan 22 09:52:08 crc kubenswrapper[4681]: I0122 
09:52:08.787585 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5abaeb7-9ea4-46ec-afbb-e30c277e59a8-kube-api-access-kkq6l" (OuterVolumeSpecName: "kube-api-access-kkq6l") pod "a5abaeb7-9ea4-46ec-afbb-e30c277e59a8" (UID: "a5abaeb7-9ea4-46ec-afbb-e30c277e59a8"). InnerVolumeSpecName "kube-api-access-kkq6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:52:08 crc kubenswrapper[4681]: I0122 09:52:08.882088 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkq6l\" (UniqueName: \"kubernetes.io/projected/a5abaeb7-9ea4-46ec-afbb-e30c277e59a8-kube-api-access-kkq6l\") on node \"crc\" DevicePath \"\"" Jan 22 09:52:09 crc kubenswrapper[4681]: I0122 09:52:09.024397 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-fnfxq"] Jan 22 09:52:09 crc kubenswrapper[4681]: I0122 09:52:09.031050 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-fnfxq"] Jan 22 09:52:09 crc kubenswrapper[4681]: I0122 09:52:09.468612 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5abaeb7-9ea4-46ec-afbb-e30c277e59a8" path="/var/lib/kubelet/pods/a5abaeb7-9ea4-46ec-afbb-e30c277e59a8/volumes" Jan 22 09:52:26 crc kubenswrapper[4681]: I0122 09:52:26.030919 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:52:26 crc kubenswrapper[4681]: I0122 09:52:26.031639 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:52:56 crc kubenswrapper[4681]: I0122 09:52:56.031166 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 09:52:56 crc kubenswrapper[4681]: I0122 09:52:56.032406 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 09:52:56 crc kubenswrapper[4681]: I0122 09:52:56.032482 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 09:52:56 crc kubenswrapper[4681]: I0122 09:52:56.033238 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0"} pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 09:52:56 crc kubenswrapper[4681]: I0122 09:52:56.033379 4681 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" containerID="cri-o://3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" gracePeriod=600 Jan 22 09:52:56 crc kubenswrapper[4681]: E0122 09:52:56.210397 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:52:57 crc kubenswrapper[4681]: I0122 09:52:57.083830 4681 generic.go:334] "Generic (PLEG): container finished" podID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" exitCode=0 Jan 22 09:52:57 crc kubenswrapper[4681]: I0122 09:52:57.084369 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerDied","Data":"3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0"} Jan 22 09:52:57 crc kubenswrapper[4681]: I0122 09:52:57.084461 4681 scope.go:117] "RemoveContainer" containerID="37fd8c7821419c43a5ef07c93ba7768e09046abcdf54d1fa9054ef3d7656703b" Jan 22 09:52:57 crc kubenswrapper[4681]: I0122 09:52:57.084982 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:52:57 crc kubenswrapper[4681]: E0122 09:52:57.085275 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:53:07 crc kubenswrapper[4681]: I0122 09:53:07.453051 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:53:07 crc kubenswrapper[4681]: E0122 09:53:07.453841 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:53:21 crc kubenswrapper[4681]: I0122 09:53:21.452628 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:53:21 crc kubenswrapper[4681]: E0122 09:53:21.455857 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" 
podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:53:30 crc kubenswrapper[4681]: I0122 09:53:30.242139 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wksrz"] Jan 22 09:53:30 crc kubenswrapper[4681]: E0122 09:53:30.243255 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5abaeb7-9ea4-46ec-afbb-e30c277e59a8" containerName="registry-server" Jan 22 09:53:30 crc kubenswrapper[4681]: I0122 09:53:30.243286 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5abaeb7-9ea4-46ec-afbb-e30c277e59a8" containerName="registry-server" Jan 22 09:53:30 crc kubenswrapper[4681]: I0122 09:53:30.243396 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5abaeb7-9ea4-46ec-afbb-e30c277e59a8" containerName="registry-server" Jan 22 09:53:30 crc kubenswrapper[4681]: I0122 09:53:30.244325 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wksrz" Jan 22 09:53:30 crc kubenswrapper[4681]: I0122 09:53:30.275917 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wksrz"] Jan 22 09:53:30 crc kubenswrapper[4681]: I0122 09:53:30.374448 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-catalog-content\") pod \"certified-operators-wksrz\" (UID: \"cb9d6ca2-040a-466e-ae73-d74edb27a4a3\") " pod="openshift-marketplace/certified-operators-wksrz" Jan 22 09:53:30 crc kubenswrapper[4681]: I0122 09:53:30.374720 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-utilities\") pod \"certified-operators-wksrz\" (UID: \"cb9d6ca2-040a-466e-ae73-d74edb27a4a3\") " pod="openshift-marketplace/certified-operators-wksrz" Jan 22 09:53:30 crc kubenswrapper[4681]: I0122 09:53:30.374843 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb4rk\" (UniqueName: \"kubernetes.io/projected/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-kube-api-access-tb4rk\") pod \"certified-operators-wksrz\" (UID: \"cb9d6ca2-040a-466e-ae73-d74edb27a4a3\") " pod="openshift-marketplace/certified-operators-wksrz" Jan 22 09:53:30 crc kubenswrapper[4681]: I0122 09:53:30.476011 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb4rk\" (UniqueName: \"kubernetes.io/projected/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-kube-api-access-tb4rk\") pod \"certified-operators-wksrz\" (UID: \"cb9d6ca2-040a-466e-ae73-d74edb27a4a3\") " pod="openshift-marketplace/certified-operators-wksrz" Jan 22 09:53:30 crc kubenswrapper[4681]: I0122 09:53:30.476359 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-catalog-content\") pod \"certified-operators-wksrz\" (UID: \"cb9d6ca2-040a-466e-ae73-d74edb27a4a3\") " pod="openshift-marketplace/certified-operators-wksrz" Jan 22 09:53:30 crc kubenswrapper[4681]: I0122 09:53:30.476475 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-utilities\") pod \"certified-operators-wksrz\" (UID: 
\"cb9d6ca2-040a-466e-ae73-d74edb27a4a3\") " pod="openshift-marketplace/certified-operators-wksrz" Jan 22 09:53:30 crc kubenswrapper[4681]: I0122 09:53:30.477052 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-utilities\") pod \"certified-operators-wksrz\" (UID: \"cb9d6ca2-040a-466e-ae73-d74edb27a4a3\") " pod="openshift-marketplace/certified-operators-wksrz" Jan 22 09:53:30 crc kubenswrapper[4681]: I0122 09:53:30.478078 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-catalog-content\") pod \"certified-operators-wksrz\" (UID: \"cb9d6ca2-040a-466e-ae73-d74edb27a4a3\") " pod="openshift-marketplace/certified-operators-wksrz" Jan 22 09:53:30 crc kubenswrapper[4681]: I0122 09:53:30.512932 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb4rk\" (UniqueName: \"kubernetes.io/projected/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-kube-api-access-tb4rk\") pod \"certified-operators-wksrz\" (UID: \"cb9d6ca2-040a-466e-ae73-d74edb27a4a3\") " pod="openshift-marketplace/certified-operators-wksrz" Jan 22 09:53:30 crc kubenswrapper[4681]: I0122 09:53:30.595486 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wksrz" Jan 22 09:53:31 crc kubenswrapper[4681]: I0122 09:53:31.032885 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wksrz"] Jan 22 09:53:31 crc kubenswrapper[4681]: I0122 09:53:31.373760 4681 generic.go:334] "Generic (PLEG): container finished" podID="cb9d6ca2-040a-466e-ae73-d74edb27a4a3" containerID="cc2db6bcf3b4bd1d507ffc3e27796eaa8a4d0656158c278dafd491a4917f193e" exitCode=0 Jan 22 09:53:31 crc kubenswrapper[4681]: I0122 09:53:31.373807 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wksrz" event={"ID":"cb9d6ca2-040a-466e-ae73-d74edb27a4a3","Type":"ContainerDied","Data":"cc2db6bcf3b4bd1d507ffc3e27796eaa8a4d0656158c278dafd491a4917f193e"} Jan 22 09:53:31 crc kubenswrapper[4681]: I0122 09:53:31.373845 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wksrz" event={"ID":"cb9d6ca2-040a-466e-ae73-d74edb27a4a3","Type":"ContainerStarted","Data":"b7a7259467c5b527a7ec55f95fe0ca376bac9ac6f811244f317cf949c5ebb567"} Jan 22 09:53:32 crc kubenswrapper[4681]: I0122 09:53:32.385564 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wksrz" event={"ID":"cb9d6ca2-040a-466e-ae73-d74edb27a4a3","Type":"ContainerStarted","Data":"900c4b0aad3b72079e285288140581e827171c70dcd23b6faabf0032da56ed4d"} Jan 22 09:53:33 crc kubenswrapper[4681]: I0122 09:53:33.395088 4681 generic.go:334] "Generic (PLEG): container finished" podID="cb9d6ca2-040a-466e-ae73-d74edb27a4a3" containerID="900c4b0aad3b72079e285288140581e827171c70dcd23b6faabf0032da56ed4d" exitCode=0 Jan 22 09:53:33 crc kubenswrapper[4681]: I0122 09:53:33.395145 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wksrz" event={"ID":"cb9d6ca2-040a-466e-ae73-d74edb27a4a3","Type":"ContainerDied","Data":"900c4b0aad3b72079e285288140581e827171c70dcd23b6faabf0032da56ed4d"} Jan 22 09:53:34 crc kubenswrapper[4681]: I0122 09:53:34.409509 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-wksrz" event={"ID":"cb9d6ca2-040a-466e-ae73-d74edb27a4a3","Type":"ContainerStarted","Data":"4b2e01aac3d0e31a9c96c9f327b4c595fa61cbfdf9e86f9161ee89c817fb8215"} Jan 22 09:53:34 crc kubenswrapper[4681]: I0122 09:53:34.441566 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wksrz" podStartSLOduration=2.011234566 podStartE2EDuration="4.441459157s" podCreationTimestamp="2026-01-22 09:53:30 +0000 UTC" firstStartedPulling="2026-01-22 09:53:31.375412651 +0000 UTC m=+3002.201323176" lastFinishedPulling="2026-01-22 09:53:33.805637222 +0000 UTC m=+3004.631547767" observedRunningTime="2026-01-22 09:53:34.439987298 +0000 UTC m=+3005.265897843" watchObservedRunningTime="2026-01-22 09:53:34.441459157 +0000 UTC m=+3005.267369672" Jan 22 09:53:34 crc kubenswrapper[4681]: I0122 09:53:34.452165 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:53:34 crc kubenswrapper[4681]: E0122 09:53:34.452505 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:53:40 crc kubenswrapper[4681]: I0122 09:53:40.596305 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wksrz" Jan 22 09:53:40 crc kubenswrapper[4681]: I0122 09:53:40.596914 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wksrz" Jan 22 09:53:40 crc kubenswrapper[4681]: I0122 09:53:40.668626 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wksrz" Jan 22 09:53:41 crc kubenswrapper[4681]: I0122 09:53:41.539293 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wksrz" Jan 22 09:53:41 crc kubenswrapper[4681]: I0122 09:53:41.606576 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wksrz"] Jan 22 09:53:43 crc kubenswrapper[4681]: I0122 09:53:43.489311 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wksrz" podUID="cb9d6ca2-040a-466e-ae73-d74edb27a4a3" containerName="registry-server" containerID="cri-o://4b2e01aac3d0e31a9c96c9f327b4c595fa61cbfdf9e86f9161ee89c817fb8215" gracePeriod=2 Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.470118 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wksrz" Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.501914 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-catalog-content\") pod \"cb9d6ca2-040a-466e-ae73-d74edb27a4a3\" (UID: \"cb9d6ca2-040a-466e-ae73-d74edb27a4a3\") " Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.502001 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-utilities\") pod \"cb9d6ca2-040a-466e-ae73-d74edb27a4a3\" (UID: \"cb9d6ca2-040a-466e-ae73-d74edb27a4a3\") " Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.502044 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb4rk\" (UniqueName: \"kubernetes.io/projected/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-kube-api-access-tb4rk\") pod \"cb9d6ca2-040a-466e-ae73-d74edb27a4a3\" (UID: \"cb9d6ca2-040a-466e-ae73-d74edb27a4a3\") " Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.505694 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-utilities" (OuterVolumeSpecName: "utilities") pod "cb9d6ca2-040a-466e-ae73-d74edb27a4a3" (UID: "cb9d6ca2-040a-466e-ae73-d74edb27a4a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.509049 4681 generic.go:334] "Generic (PLEG): container finished" podID="cb9d6ca2-040a-466e-ae73-d74edb27a4a3" containerID="4b2e01aac3d0e31a9c96c9f327b4c595fa61cbfdf9e86f9161ee89c817fb8215" exitCode=0 Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.509100 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wksrz" event={"ID":"cb9d6ca2-040a-466e-ae73-d74edb27a4a3","Type":"ContainerDied","Data":"4b2e01aac3d0e31a9c96c9f327b4c595fa61cbfdf9e86f9161ee89c817fb8215"} Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.509134 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wksrz" event={"ID":"cb9d6ca2-040a-466e-ae73-d74edb27a4a3","Type":"ContainerDied","Data":"b7a7259467c5b527a7ec55f95fe0ca376bac9ac6f811244f317cf949c5ebb567"} Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.509170 4681 scope.go:117] "RemoveContainer" containerID="4b2e01aac3d0e31a9c96c9f327b4c595fa61cbfdf9e86f9161ee89c817fb8215" Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.509536 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wksrz" Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.529906 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-kube-api-access-tb4rk" (OuterVolumeSpecName: "kube-api-access-tb4rk") pod "cb9d6ca2-040a-466e-ae73-d74edb27a4a3" (UID: "cb9d6ca2-040a-466e-ae73-d74edb27a4a3"). InnerVolumeSpecName "kube-api-access-tb4rk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.567702 4681 scope.go:117] "RemoveContainer" containerID="900c4b0aad3b72079e285288140581e827171c70dcd23b6faabf0032da56ed4d" Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.573436 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb9d6ca2-040a-466e-ae73-d74edb27a4a3" (UID: "cb9d6ca2-040a-466e-ae73-d74edb27a4a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.592614 4681 scope.go:117] "RemoveContainer" containerID="cc2db6bcf3b4bd1d507ffc3e27796eaa8a4d0656158c278dafd491a4917f193e" Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.604476 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.604612 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.604681 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb4rk\" (UniqueName: \"kubernetes.io/projected/cb9d6ca2-040a-466e-ae73-d74edb27a4a3-kube-api-access-tb4rk\") on node \"crc\" DevicePath \"\"" Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.617961 4681 scope.go:117] "RemoveContainer" containerID="4b2e01aac3d0e31a9c96c9f327b4c595fa61cbfdf9e86f9161ee89c817fb8215" Jan 22 09:53:44 crc kubenswrapper[4681]: E0122 09:53:44.618967 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b2e01aac3d0e31a9c96c9f327b4c595fa61cbfdf9e86f9161ee89c817fb8215\": container with ID starting with 4b2e01aac3d0e31a9c96c9f327b4c595fa61cbfdf9e86f9161ee89c817fb8215 not found: ID does not exist" containerID="4b2e01aac3d0e31a9c96c9f327b4c595fa61cbfdf9e86f9161ee89c817fb8215" Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.619113 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b2e01aac3d0e31a9c96c9f327b4c595fa61cbfdf9e86f9161ee89c817fb8215"} err="failed to get container status \"4b2e01aac3d0e31a9c96c9f327b4c595fa61cbfdf9e86f9161ee89c817fb8215\": rpc error: code = NotFound desc = could not find container \"4b2e01aac3d0e31a9c96c9f327b4c595fa61cbfdf9e86f9161ee89c817fb8215\": container with ID starting with 4b2e01aac3d0e31a9c96c9f327b4c595fa61cbfdf9e86f9161ee89c817fb8215 not found: ID does not exist" Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.619222 4681 scope.go:117] "RemoveContainer" containerID="900c4b0aad3b72079e285288140581e827171c70dcd23b6faabf0032da56ed4d" Jan 22 09:53:44 crc kubenswrapper[4681]: E0122 09:53:44.620012 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"900c4b0aad3b72079e285288140581e827171c70dcd23b6faabf0032da56ed4d\": container with ID starting with 900c4b0aad3b72079e285288140581e827171c70dcd23b6faabf0032da56ed4d not found: ID does not exist" containerID="900c4b0aad3b72079e285288140581e827171c70dcd23b6faabf0032da56ed4d" Jan 
22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.620120 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"900c4b0aad3b72079e285288140581e827171c70dcd23b6faabf0032da56ed4d"} err="failed to get container status \"900c4b0aad3b72079e285288140581e827171c70dcd23b6faabf0032da56ed4d\": rpc error: code = NotFound desc = could not find container \"900c4b0aad3b72079e285288140581e827171c70dcd23b6faabf0032da56ed4d\": container with ID starting with 900c4b0aad3b72079e285288140581e827171c70dcd23b6faabf0032da56ed4d not found: ID does not exist" Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.620175 4681 scope.go:117] "RemoveContainer" containerID="cc2db6bcf3b4bd1d507ffc3e27796eaa8a4d0656158c278dafd491a4917f193e" Jan 22 09:53:44 crc kubenswrapper[4681]: E0122 09:53:44.620653 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc2db6bcf3b4bd1d507ffc3e27796eaa8a4d0656158c278dafd491a4917f193e\": container with ID starting with cc2db6bcf3b4bd1d507ffc3e27796eaa8a4d0656158c278dafd491a4917f193e not found: ID does not exist" containerID="cc2db6bcf3b4bd1d507ffc3e27796eaa8a4d0656158c278dafd491a4917f193e" Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.620753 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc2db6bcf3b4bd1d507ffc3e27796eaa8a4d0656158c278dafd491a4917f193e"} err="failed to get container status \"cc2db6bcf3b4bd1d507ffc3e27796eaa8a4d0656158c278dafd491a4917f193e\": rpc error: code = NotFound desc = could not find container \"cc2db6bcf3b4bd1d507ffc3e27796eaa8a4d0656158c278dafd491a4917f193e\": container with ID starting with cc2db6bcf3b4bd1d507ffc3e27796eaa8a4d0656158c278dafd491a4917f193e not found: ID does not exist" Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.853539 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wksrz"] Jan 22 09:53:44 crc kubenswrapper[4681]: I0122 09:53:44.863872 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wksrz"] Jan 22 09:53:45 crc kubenswrapper[4681]: I0122 09:53:45.468529 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb9d6ca2-040a-466e-ae73-d74edb27a4a3" path="/var/lib/kubelet/pods/cb9d6ca2-040a-466e-ae73-d74edb27a4a3/volumes" Jan 22 09:53:47 crc kubenswrapper[4681]: I0122 09:53:47.452887 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:53:47 crc kubenswrapper[4681]: E0122 09:53:47.453635 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:54:02 crc kubenswrapper[4681]: I0122 09:54:02.452738 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:54:02 crc kubenswrapper[4681]: E0122 09:54:02.453434 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:54:17 crc kubenswrapper[4681]: I0122 09:54:17.455513 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:54:17 crc kubenswrapper[4681]: E0122 09:54:17.456360 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:54:28 crc kubenswrapper[4681]: I0122 09:54:28.453373 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:54:28 crc kubenswrapper[4681]: E0122 09:54:28.454444 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:54:39 crc kubenswrapper[4681]: I0122 09:54:39.459298 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:54:39 crc kubenswrapper[4681]: E0122 09:54:39.460095 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:54:52 crc kubenswrapper[4681]: I0122 09:54:52.452908 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:54:52 crc kubenswrapper[4681]: E0122 09:54:52.455580 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:55:03 crc kubenswrapper[4681]: I0122 09:55:03.452590 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:55:03 crc kubenswrapper[4681]: E0122 09:55:03.453329 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" 
podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:55:14 crc kubenswrapper[4681]: I0122 09:55:14.452964 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:55:14 crc kubenswrapper[4681]: E0122 09:55:14.454013 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:55:29 crc kubenswrapper[4681]: I0122 09:55:29.463453 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:55:29 crc kubenswrapper[4681]: E0122 09:55:29.464627 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:55:43 crc kubenswrapper[4681]: I0122 09:55:43.455350 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:55:43 crc kubenswrapper[4681]: E0122 09:55:43.456546 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:55:57 crc kubenswrapper[4681]: I0122 09:55:57.455486 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:55:57 crc kubenswrapper[4681]: E0122 09:55:57.456738 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:56:12 crc kubenswrapper[4681]: I0122 09:56:12.455926 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:56:12 crc kubenswrapper[4681]: E0122 09:56:12.457198 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:56:26 crc kubenswrapper[4681]: I0122 09:56:26.452994 4681 scope.go:117] "RemoveContainer" 
containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:56:26 crc kubenswrapper[4681]: E0122 09:56:26.453512 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:56:39 crc kubenswrapper[4681]: I0122 09:56:39.459595 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:56:39 crc kubenswrapper[4681]: E0122 09:56:39.460714 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:56:50 crc kubenswrapper[4681]: I0122 09:56:50.452487 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:56:50 crc kubenswrapper[4681]: E0122 09:56:50.453579 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:57:05 crc kubenswrapper[4681]: I0122 09:57:05.454563 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:57:05 crc kubenswrapper[4681]: E0122 09:57:05.455794 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:57:16 crc kubenswrapper[4681]: I0122 09:57:16.453131 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:57:16 crc kubenswrapper[4681]: E0122 09:57:16.453740 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:57:27 crc kubenswrapper[4681]: I0122 09:57:27.452990 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:57:27 crc kubenswrapper[4681]: E0122 09:57:27.453880 4681 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.227090 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x99fx"] Jan 22 09:57:34 crc kubenswrapper[4681]: E0122 09:57:34.227982 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb9d6ca2-040a-466e-ae73-d74edb27a4a3" containerName="extract-content" Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.227999 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb9d6ca2-040a-466e-ae73-d74edb27a4a3" containerName="extract-content" Jan 22 09:57:34 crc kubenswrapper[4681]: E0122 09:57:34.228017 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb9d6ca2-040a-466e-ae73-d74edb27a4a3" containerName="registry-server" Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.228025 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb9d6ca2-040a-466e-ae73-d74edb27a4a3" containerName="registry-server" Jan 22 09:57:34 crc kubenswrapper[4681]: E0122 09:57:34.228038 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb9d6ca2-040a-466e-ae73-d74edb27a4a3" containerName="extract-utilities" Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.228046 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb9d6ca2-040a-466e-ae73-d74edb27a4a3" containerName="extract-utilities" Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.228200 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb9d6ca2-040a-466e-ae73-d74edb27a4a3" containerName="registry-server" Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.229167 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x99fx" Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.236172 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x99fx"] Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.321651 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh6hm\" (UniqueName: \"kubernetes.io/projected/77454808-a126-4f0e-9daf-d9e1effcb003-kube-api-access-bh6hm\") pod \"community-operators-x99fx\" (UID: \"77454808-a126-4f0e-9daf-d9e1effcb003\") " pod="openshift-marketplace/community-operators-x99fx" Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.321685 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77454808-a126-4f0e-9daf-d9e1effcb003-utilities\") pod \"community-operators-x99fx\" (UID: \"77454808-a126-4f0e-9daf-d9e1effcb003\") " pod="openshift-marketplace/community-operators-x99fx" Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.321742 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77454808-a126-4f0e-9daf-d9e1effcb003-catalog-content\") pod \"community-operators-x99fx\" (UID: \"77454808-a126-4f0e-9daf-d9e1effcb003\") " pod="openshift-marketplace/community-operators-x99fx" Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.423033 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77454808-a126-4f0e-9daf-d9e1effcb003-catalog-content\") pod \"community-operators-x99fx\" (UID: \"77454808-a126-4f0e-9daf-d9e1effcb003\") " pod="openshift-marketplace/community-operators-x99fx" Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.423160 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh6hm\" (UniqueName: \"kubernetes.io/projected/77454808-a126-4f0e-9daf-d9e1effcb003-kube-api-access-bh6hm\") pod \"community-operators-x99fx\" (UID: \"77454808-a126-4f0e-9daf-d9e1effcb003\") " pod="openshift-marketplace/community-operators-x99fx" Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.423186 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77454808-a126-4f0e-9daf-d9e1effcb003-utilities\") pod \"community-operators-x99fx\" (UID: \"77454808-a126-4f0e-9daf-d9e1effcb003\") " pod="openshift-marketplace/community-operators-x99fx" Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.423490 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77454808-a126-4f0e-9daf-d9e1effcb003-catalog-content\") pod \"community-operators-x99fx\" (UID: \"77454808-a126-4f0e-9daf-d9e1effcb003\") " pod="openshift-marketplace/community-operators-x99fx" Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.423576 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77454808-a126-4f0e-9daf-d9e1effcb003-utilities\") pod \"community-operators-x99fx\" (UID: \"77454808-a126-4f0e-9daf-d9e1effcb003\") " pod="openshift-marketplace/community-operators-x99fx" Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.447807 4681 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bh6hm\" (UniqueName: \"kubernetes.io/projected/77454808-a126-4f0e-9daf-d9e1effcb003-kube-api-access-bh6hm\") pod \"community-operators-x99fx\" (UID: \"77454808-a126-4f0e-9daf-d9e1effcb003\") " pod="openshift-marketplace/community-operators-x99fx" Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.589593 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x99fx" Jan 22 09:57:34 crc kubenswrapper[4681]: I0122 09:57:34.860976 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x99fx"] Jan 22 09:57:35 crc kubenswrapper[4681]: I0122 09:57:35.447205 4681 generic.go:334] "Generic (PLEG): container finished" podID="77454808-a126-4f0e-9daf-d9e1effcb003" containerID="effeb63d7bb4ba0db58d057a1e69cf820e86743d1bc286c3a75bf6d97abdc9cb" exitCode=0 Jan 22 09:57:35 crc kubenswrapper[4681]: I0122 09:57:35.447287 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x99fx" event={"ID":"77454808-a126-4f0e-9daf-d9e1effcb003","Type":"ContainerDied","Data":"effeb63d7bb4ba0db58d057a1e69cf820e86743d1bc286c3a75bf6d97abdc9cb"} Jan 22 09:57:35 crc kubenswrapper[4681]: I0122 09:57:35.447328 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x99fx" event={"ID":"77454808-a126-4f0e-9daf-d9e1effcb003","Type":"ContainerStarted","Data":"5e12836ad26852378a6d5a0cf240672cffe2856ee6623213caf6231cb984e2d7"} Jan 22 09:57:35 crc kubenswrapper[4681]: I0122 09:57:35.449922 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 22 09:57:36 crc kubenswrapper[4681]: I0122 09:57:36.453823 4681 generic.go:334] "Generic (PLEG): container finished" podID="77454808-a126-4f0e-9daf-d9e1effcb003" containerID="1dc2f67cbf804a8edae21e5ad95b4c82fec24e9b739d4ad395a28478b004c4f4" exitCode=0 Jan 22 09:57:36 crc kubenswrapper[4681]: I0122 09:57:36.453888 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x99fx" event={"ID":"77454808-a126-4f0e-9daf-d9e1effcb003","Type":"ContainerDied","Data":"1dc2f67cbf804a8edae21e5ad95b4c82fec24e9b739d4ad395a28478b004c4f4"} Jan 22 09:57:37 crc kubenswrapper[4681]: I0122 09:57:37.464919 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x99fx" event={"ID":"77454808-a126-4f0e-9daf-d9e1effcb003","Type":"ContainerStarted","Data":"3e15a9e79a3cce5ab63f4fd1e20b8d917e97cc99c4ea8b7d0244044ae6eb28d8"} Jan 22 09:57:40 crc kubenswrapper[4681]: I0122 09:57:40.789475 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x99fx" podStartSLOduration=5.398228606 podStartE2EDuration="6.789455793s" podCreationTimestamp="2026-01-22 09:57:34 +0000 UTC" firstStartedPulling="2026-01-22 09:57:35.44968541 +0000 UTC m=+3246.275595915" lastFinishedPulling="2026-01-22 09:57:36.840912597 +0000 UTC m=+3247.666823102" observedRunningTime="2026-01-22 09:57:37.502754962 +0000 UTC m=+3248.328665477" watchObservedRunningTime="2026-01-22 09:57:40.789455793 +0000 UTC m=+3251.615366308" Jan 22 09:57:40 crc kubenswrapper[4681]: I0122 09:57:40.794389 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-operators-8ms88"] Jan 22 09:57:40 crc kubenswrapper[4681]: I0122 09:57:40.797237 4681 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-8ms88" Jan 22 09:57:40 crc kubenswrapper[4681]: I0122 09:57:40.801713 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-8ms88"] Jan 22 09:57:40 crc kubenswrapper[4681]: I0122 09:57:40.918776 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlwdl\" (UniqueName: \"kubernetes.io/projected/d242700b-26b4-4b83-83d3-1a49852f6cd5-kube-api-access-qlwdl\") pod \"service-telemetry-framework-operators-8ms88\" (UID: \"d242700b-26b4-4b83-83d3-1a49852f6cd5\") " pod="service-telemetry/service-telemetry-framework-operators-8ms88" Jan 22 09:57:41 crc kubenswrapper[4681]: I0122 09:57:41.022247 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlwdl\" (UniqueName: \"kubernetes.io/projected/d242700b-26b4-4b83-83d3-1a49852f6cd5-kube-api-access-qlwdl\") pod \"service-telemetry-framework-operators-8ms88\" (UID: \"d242700b-26b4-4b83-83d3-1a49852f6cd5\") " pod="service-telemetry/service-telemetry-framework-operators-8ms88" Jan 22 09:57:41 crc kubenswrapper[4681]: I0122 09:57:41.048087 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlwdl\" (UniqueName: \"kubernetes.io/projected/d242700b-26b4-4b83-83d3-1a49852f6cd5-kube-api-access-qlwdl\") pod \"service-telemetry-framework-operators-8ms88\" (UID: \"d242700b-26b4-4b83-83d3-1a49852f6cd5\") " pod="service-telemetry/service-telemetry-framework-operators-8ms88" Jan 22 09:57:41 crc kubenswrapper[4681]: I0122 09:57:41.118051 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-8ms88" Jan 22 09:57:41 crc kubenswrapper[4681]: I0122 09:57:41.379927 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-8ms88"] Jan 22 09:57:41 crc kubenswrapper[4681]: W0122 09:57:41.384100 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd242700b_26b4_4b83_83d3_1a49852f6cd5.slice/crio-ff87d21678e10da3d1da43e97c1ce20205d4f477f64fce42bc5eda5ee2dd042c WatchSource:0}: Error finding container ff87d21678e10da3d1da43e97c1ce20205d4f477f64fce42bc5eda5ee2dd042c: Status 404 returned error can't find the container with id ff87d21678e10da3d1da43e97c1ce20205d4f477f64fce42bc5eda5ee2dd042c Jan 22 09:57:41 crc kubenswrapper[4681]: I0122 09:57:41.455961 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:57:41 crc kubenswrapper[4681]: E0122 09:57:41.456237 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:57:41 crc kubenswrapper[4681]: I0122 09:57:41.497883 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-8ms88" 
event={"ID":"d242700b-26b4-4b83-83d3-1a49852f6cd5","Type":"ContainerStarted","Data":"ff87d21678e10da3d1da43e97c1ce20205d4f477f64fce42bc5eda5ee2dd042c"} Jan 22 09:57:42 crc kubenswrapper[4681]: I0122 09:57:42.506119 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-8ms88" event={"ID":"d242700b-26b4-4b83-83d3-1a49852f6cd5","Type":"ContainerStarted","Data":"0e86edc1be7d648d7c2a52b21f8352940c0646d63eb7d5f46506bd4207c8aea0"} Jan 22 09:57:42 crc kubenswrapper[4681]: I0122 09:57:42.529141 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-operators-8ms88" podStartSLOduration=2.257086007 podStartE2EDuration="2.529120742s" podCreationTimestamp="2026-01-22 09:57:40 +0000 UTC" firstStartedPulling="2026-01-22 09:57:41.3864143 +0000 UTC m=+3252.212324815" lastFinishedPulling="2026-01-22 09:57:41.658449015 +0000 UTC m=+3252.484359550" observedRunningTime="2026-01-22 09:57:42.523570787 +0000 UTC m=+3253.349481292" watchObservedRunningTime="2026-01-22 09:57:42.529120742 +0000 UTC m=+3253.355031257" Jan 22 09:57:44 crc kubenswrapper[4681]: I0122 09:57:44.590508 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x99fx" Jan 22 09:57:44 crc kubenswrapper[4681]: I0122 09:57:44.590795 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x99fx" Jan 22 09:57:44 crc kubenswrapper[4681]: I0122 09:57:44.627213 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x99fx" Jan 22 09:57:45 crc kubenswrapper[4681]: I0122 09:57:45.600367 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x99fx" Jan 22 09:57:50 crc kubenswrapper[4681]: I0122 09:57:50.582018 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x99fx"] Jan 22 09:57:50 crc kubenswrapper[4681]: I0122 09:57:50.583068 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x99fx" podUID="77454808-a126-4f0e-9daf-d9e1effcb003" containerName="registry-server" containerID="cri-o://3e15a9e79a3cce5ab63f4fd1e20b8d917e97cc99c4ea8b7d0244044ae6eb28d8" gracePeriod=2 Jan 22 09:57:51 crc kubenswrapper[4681]: I0122 09:57:51.118461 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/service-telemetry-framework-operators-8ms88" Jan 22 09:57:51 crc kubenswrapper[4681]: I0122 09:57:51.119444 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/service-telemetry-framework-operators-8ms88" Jan 22 09:57:51 crc kubenswrapper[4681]: I0122 09:57:51.151981 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/service-telemetry-framework-operators-8ms88" Jan 22 09:57:51 crc kubenswrapper[4681]: I0122 09:57:51.575275 4681 generic.go:334] "Generic (PLEG): container finished" podID="77454808-a126-4f0e-9daf-d9e1effcb003" containerID="3e15a9e79a3cce5ab63f4fd1e20b8d917e97cc99c4ea8b7d0244044ae6eb28d8" exitCode=0 Jan 22 09:57:51 crc kubenswrapper[4681]: I0122 09:57:51.575458 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x99fx" 
event={"ID":"77454808-a126-4f0e-9daf-d9e1effcb003","Type":"ContainerDied","Data":"3e15a9e79a3cce5ab63f4fd1e20b8d917e97cc99c4ea8b7d0244044ae6eb28d8"} Jan 22 09:57:51 crc kubenswrapper[4681]: I0122 09:57:51.603841 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/service-telemetry-framework-operators-8ms88" Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.073005 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x99fx" Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.244970 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77454808-a126-4f0e-9daf-d9e1effcb003-utilities\") pod \"77454808-a126-4f0e-9daf-d9e1effcb003\" (UID: \"77454808-a126-4f0e-9daf-d9e1effcb003\") " Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.245063 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh6hm\" (UniqueName: \"kubernetes.io/projected/77454808-a126-4f0e-9daf-d9e1effcb003-kube-api-access-bh6hm\") pod \"77454808-a126-4f0e-9daf-d9e1effcb003\" (UID: \"77454808-a126-4f0e-9daf-d9e1effcb003\") " Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.245111 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77454808-a126-4f0e-9daf-d9e1effcb003-catalog-content\") pod \"77454808-a126-4f0e-9daf-d9e1effcb003\" (UID: \"77454808-a126-4f0e-9daf-d9e1effcb003\") " Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.251308 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77454808-a126-4f0e-9daf-d9e1effcb003-utilities" (OuterVolumeSpecName: "utilities") pod "77454808-a126-4f0e-9daf-d9e1effcb003" (UID: "77454808-a126-4f0e-9daf-d9e1effcb003"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.262401 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77454808-a126-4f0e-9daf-d9e1effcb003-kube-api-access-bh6hm" (OuterVolumeSpecName: "kube-api-access-bh6hm") pod "77454808-a126-4f0e-9daf-d9e1effcb003" (UID: "77454808-a126-4f0e-9daf-d9e1effcb003"). InnerVolumeSpecName "kube-api-access-bh6hm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.297794 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77454808-a126-4f0e-9daf-d9e1effcb003-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77454808-a126-4f0e-9daf-d9e1effcb003" (UID: "77454808-a126-4f0e-9daf-d9e1effcb003"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.346499 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77454808-a126-4f0e-9daf-d9e1effcb003-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.346749 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77454808-a126-4f0e-9daf-d9e1effcb003-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.346808 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh6hm\" (UniqueName: \"kubernetes.io/projected/77454808-a126-4f0e-9daf-d9e1effcb003-kube-api-access-bh6hm\") on node \"crc\" DevicePath \"\"" Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.452551 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:57:52 crc kubenswrapper[4681]: E0122 09:57:52.453100 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.585768 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x99fx" Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.594076 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x99fx" event={"ID":"77454808-a126-4f0e-9daf-d9e1effcb003","Type":"ContainerDied","Data":"5e12836ad26852378a6d5a0cf240672cffe2856ee6623213caf6231cb984e2d7"} Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.594120 4681 scope.go:117] "RemoveContainer" containerID="3e15a9e79a3cce5ab63f4fd1e20b8d917e97cc99c4ea8b7d0244044ae6eb28d8" Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.620331 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x99fx"] Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.625360 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x99fx"] Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.627642 4681 scope.go:117] "RemoveContainer" containerID="1dc2f67cbf804a8edae21e5ad95b4c82fec24e9b739d4ad395a28478b004c4f4" Jan 22 09:57:52 crc kubenswrapper[4681]: I0122 09:57:52.662190 4681 scope.go:117] "RemoveContainer" containerID="effeb63d7bb4ba0db58d057a1e69cf820e86743d1bc286c3a75bf6d97abdc9cb" Jan 22 09:57:53 crc kubenswrapper[4681]: I0122 09:57:53.461790 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77454808-a126-4f0e-9daf-d9e1effcb003" path="/var/lib/kubelet/pods/77454808-a126-4f0e-9daf-d9e1effcb003/volumes" Jan 22 09:57:54 crc kubenswrapper[4681]: I0122 09:57:54.602178 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-8ms88"] Jan 22 09:57:54 crc kubenswrapper[4681]: I0122 09:57:54.611067 4681 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="service-telemetry/service-telemetry-framework-operators-8ms88" podUID="d242700b-26b4-4b83-83d3-1a49852f6cd5" containerName="registry-server" containerID="cri-o://0e86edc1be7d648d7c2a52b21f8352940c0646d63eb7d5f46506bd4207c8aea0" gracePeriod=2 Jan 22 09:57:54 crc kubenswrapper[4681]: I0122 09:57:54.993058 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-8ms88" Jan 22 09:57:55 crc kubenswrapper[4681]: I0122 09:57:55.100073 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlwdl\" (UniqueName: \"kubernetes.io/projected/d242700b-26b4-4b83-83d3-1a49852f6cd5-kube-api-access-qlwdl\") pod \"d242700b-26b4-4b83-83d3-1a49852f6cd5\" (UID: \"d242700b-26b4-4b83-83d3-1a49852f6cd5\") " Jan 22 09:57:55 crc kubenswrapper[4681]: I0122 09:57:55.110526 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d242700b-26b4-4b83-83d3-1a49852f6cd5-kube-api-access-qlwdl" (OuterVolumeSpecName: "kube-api-access-qlwdl") pod "d242700b-26b4-4b83-83d3-1a49852f6cd5" (UID: "d242700b-26b4-4b83-83d3-1a49852f6cd5"). InnerVolumeSpecName "kube-api-access-qlwdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:57:55 crc kubenswrapper[4681]: I0122 09:57:55.201871 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlwdl\" (UniqueName: \"kubernetes.io/projected/d242700b-26b4-4b83-83d3-1a49852f6cd5-kube-api-access-qlwdl\") on node \"crc\" DevicePath \"\"" Jan 22 09:57:55 crc kubenswrapper[4681]: I0122 09:57:55.618255 4681 generic.go:334] "Generic (PLEG): container finished" podID="d242700b-26b4-4b83-83d3-1a49852f6cd5" containerID="0e86edc1be7d648d7c2a52b21f8352940c0646d63eb7d5f46506bd4207c8aea0" exitCode=0 Jan 22 09:57:55 crc kubenswrapper[4681]: I0122 09:57:55.618341 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-8ms88" event={"ID":"d242700b-26b4-4b83-83d3-1a49852f6cd5","Type":"ContainerDied","Data":"0e86edc1be7d648d7c2a52b21f8352940c0646d63eb7d5f46506bd4207c8aea0"} Jan 22 09:57:55 crc kubenswrapper[4681]: I0122 09:57:55.618338 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-8ms88" Jan 22 09:57:55 crc kubenswrapper[4681]: I0122 09:57:55.618382 4681 scope.go:117] "RemoveContainer" containerID="0e86edc1be7d648d7c2a52b21f8352940c0646d63eb7d5f46506bd4207c8aea0" Jan 22 09:57:55 crc kubenswrapper[4681]: I0122 09:57:55.618369 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-8ms88" event={"ID":"d242700b-26b4-4b83-83d3-1a49852f6cd5","Type":"ContainerDied","Data":"ff87d21678e10da3d1da43e97c1ce20205d4f477f64fce42bc5eda5ee2dd042c"} Jan 22 09:57:55 crc kubenswrapper[4681]: I0122 09:57:55.643661 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-8ms88"] Jan 22 09:57:55 crc kubenswrapper[4681]: I0122 09:57:55.647400 4681 scope.go:117] "RemoveContainer" containerID="0e86edc1be7d648d7c2a52b21f8352940c0646d63eb7d5f46506bd4207c8aea0" Jan 22 09:57:55 crc kubenswrapper[4681]: E0122 09:57:55.647874 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e86edc1be7d648d7c2a52b21f8352940c0646d63eb7d5f46506bd4207c8aea0\": container with ID starting with 0e86edc1be7d648d7c2a52b21f8352940c0646d63eb7d5f46506bd4207c8aea0 not found: ID does not exist" containerID="0e86edc1be7d648d7c2a52b21f8352940c0646d63eb7d5f46506bd4207c8aea0" Jan 22 09:57:55 crc kubenswrapper[4681]: I0122 09:57:55.647929 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e86edc1be7d648d7c2a52b21f8352940c0646d63eb7d5f46506bd4207c8aea0"} err="failed to get container status \"0e86edc1be7d648d7c2a52b21f8352940c0646d63eb7d5f46506bd4207c8aea0\": rpc error: code = NotFound desc = could not find container \"0e86edc1be7d648d7c2a52b21f8352940c0646d63eb7d5f46506bd4207c8aea0\": container with ID starting with 0e86edc1be7d648d7c2a52b21f8352940c0646d63eb7d5f46506bd4207c8aea0 not found: ID does not exist" Jan 22 09:57:55 crc kubenswrapper[4681]: I0122 09:57:55.650190 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-8ms88"] Jan 22 09:57:57 crc kubenswrapper[4681]: I0122 09:57:57.461852 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d242700b-26b4-4b83-83d3-1a49852f6cd5" path="/var/lib/kubelet/pods/d242700b-26b4-4b83-83d3-1a49852f6cd5/volumes" Jan 22 09:58:04 crc kubenswrapper[4681]: I0122 09:58:04.453031 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 09:58:05 crc kubenswrapper[4681]: I0122 09:58:05.707229 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerStarted","Data":"5a0ef703eb0854d8032ccfc4597b68af7af2123887c04b1946b6f324a63c5f21"} Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.284348 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2d4f4"] Jan 22 09:58:22 crc kubenswrapper[4681]: E0122 09:58:22.286126 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77454808-a126-4f0e-9daf-d9e1effcb003" containerName="extract-utilities" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.286151 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="77454808-a126-4f0e-9daf-d9e1effcb003" containerName="extract-utilities" 
Jan 22 09:58:22 crc kubenswrapper[4681]: E0122 09:58:22.286165 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77454808-a126-4f0e-9daf-d9e1effcb003" containerName="registry-server" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.286174 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="77454808-a126-4f0e-9daf-d9e1effcb003" containerName="registry-server" Jan 22 09:58:22 crc kubenswrapper[4681]: E0122 09:58:22.286195 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d242700b-26b4-4b83-83d3-1a49852f6cd5" containerName="registry-server" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.286204 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="d242700b-26b4-4b83-83d3-1a49852f6cd5" containerName="registry-server" Jan 22 09:58:22 crc kubenswrapper[4681]: E0122 09:58:22.286233 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77454808-a126-4f0e-9daf-d9e1effcb003" containerName="extract-content" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.286241 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="77454808-a126-4f0e-9daf-d9e1effcb003" containerName="extract-content" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.286955 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="77454808-a126-4f0e-9daf-d9e1effcb003" containerName="registry-server" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.286988 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="d242700b-26b4-4b83-83d3-1a49852f6cd5" containerName="registry-server" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.288158 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2d4f4" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.300191 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2d4f4"] Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.360532 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-catalog-content\") pod \"redhat-operators-2d4f4\" (UID: \"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49\") " pod="openshift-marketplace/redhat-operators-2d4f4" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.360754 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqrss\" (UniqueName: \"kubernetes.io/projected/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-kube-api-access-vqrss\") pod \"redhat-operators-2d4f4\" (UID: \"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49\") " pod="openshift-marketplace/redhat-operators-2d4f4" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.361011 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-utilities\") pod \"redhat-operators-2d4f4\" (UID: \"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49\") " pod="openshift-marketplace/redhat-operators-2d4f4" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.462646 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-catalog-content\") pod \"redhat-operators-2d4f4\" (UID: \"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49\") " 
pod="openshift-marketplace/redhat-operators-2d4f4" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.462729 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqrss\" (UniqueName: \"kubernetes.io/projected/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-kube-api-access-vqrss\") pod \"redhat-operators-2d4f4\" (UID: \"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49\") " pod="openshift-marketplace/redhat-operators-2d4f4" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.462777 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-utilities\") pod \"redhat-operators-2d4f4\" (UID: \"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49\") " pod="openshift-marketplace/redhat-operators-2d4f4" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.463368 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-utilities\") pod \"redhat-operators-2d4f4\" (UID: \"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49\") " pod="openshift-marketplace/redhat-operators-2d4f4" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.463758 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-catalog-content\") pod \"redhat-operators-2d4f4\" (UID: \"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49\") " pod="openshift-marketplace/redhat-operators-2d4f4" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.491195 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqrss\" (UniqueName: \"kubernetes.io/projected/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-kube-api-access-vqrss\") pod \"redhat-operators-2d4f4\" (UID: \"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49\") " pod="openshift-marketplace/redhat-operators-2d4f4" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.614699 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2d4f4" Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.860840 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2d4f4"] Jan 22 09:58:22 crc kubenswrapper[4681]: I0122 09:58:22.879066 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d4f4" event={"ID":"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49","Type":"ContainerStarted","Data":"bfc42b65ebc990bec1f7af108861180674dc9d21e031400477315e8c2a0ff6fc"} Jan 22 09:58:23 crc kubenswrapper[4681]: I0122 09:58:23.890821 4681 generic.go:334] "Generic (PLEG): container finished" podID="7b06b8f4-2afb-46ff-8ddd-363f59f7fd49" containerID="b4e8bb2d8fe88721d42097510dc0a93eda767df0057f9a2d83260b3d1ef60f93" exitCode=0 Jan 22 09:58:23 crc kubenswrapper[4681]: I0122 09:58:23.890938 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d4f4" event={"ID":"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49","Type":"ContainerDied","Data":"b4e8bb2d8fe88721d42097510dc0a93eda767df0057f9a2d83260b3d1ef60f93"} Jan 22 09:58:24 crc kubenswrapper[4681]: I0122 09:58:24.907785 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d4f4" event={"ID":"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49","Type":"ContainerStarted","Data":"fb6a32fb119e55dfb9e138f48915fd28618eedef138d7506c241dcdda0818369"} Jan 22 09:58:25 crc kubenswrapper[4681]: I0122 09:58:25.937122 4681 generic.go:334] "Generic (PLEG): container finished" podID="7b06b8f4-2afb-46ff-8ddd-363f59f7fd49" containerID="fb6a32fb119e55dfb9e138f48915fd28618eedef138d7506c241dcdda0818369" exitCode=0 Jan 22 09:58:25 crc kubenswrapper[4681]: I0122 09:58:25.937176 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d4f4" event={"ID":"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49","Type":"ContainerDied","Data":"fb6a32fb119e55dfb9e138f48915fd28618eedef138d7506c241dcdda0818369"} Jan 22 09:58:30 crc kubenswrapper[4681]: I0122 09:58:30.983151 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d4f4" event={"ID":"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49","Type":"ContainerStarted","Data":"cbe2c90950d7e0f7337b11957e1d82f845fdd6499209a560071d7df64fee260a"} Jan 22 09:58:31 crc kubenswrapper[4681]: I0122 09:58:31.003530 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2d4f4" podStartSLOduration=5.6651663679999995 podStartE2EDuration="9.003511838s" podCreationTimestamp="2026-01-22 09:58:22 +0000 UTC" firstStartedPulling="2026-01-22 09:58:23.892823784 +0000 UTC m=+3294.718734289" lastFinishedPulling="2026-01-22 09:58:27.231169254 +0000 UTC m=+3298.057079759" observedRunningTime="2026-01-22 09:58:30.999052131 +0000 UTC m=+3301.824962646" watchObservedRunningTime="2026-01-22 09:58:31.003511838 +0000 UTC m=+3301.829422353" Jan 22 09:58:32 crc kubenswrapper[4681]: I0122 09:58:32.615387 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2d4f4" Jan 22 09:58:32 crc kubenswrapper[4681]: I0122 09:58:32.615435 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2d4f4" Jan 22 09:58:33 crc kubenswrapper[4681]: I0122 09:58:33.671752 4681 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2d4f4" 
podUID="7b06b8f4-2afb-46ff-8ddd-363f59f7fd49" containerName="registry-server" probeResult="failure" output=< Jan 22 09:58:33 crc kubenswrapper[4681]: timeout: failed to connect service ":50051" within 1s Jan 22 09:58:33 crc kubenswrapper[4681]: > Jan 22 09:58:42 crc kubenswrapper[4681]: I0122 09:58:42.683952 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2d4f4" Jan 22 09:58:42 crc kubenswrapper[4681]: I0122 09:58:42.740751 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2d4f4" Jan 22 09:58:42 crc kubenswrapper[4681]: I0122 09:58:42.930723 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2d4f4"] Jan 22 09:58:44 crc kubenswrapper[4681]: I0122 09:58:44.092292 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2d4f4" podUID="7b06b8f4-2afb-46ff-8ddd-363f59f7fd49" containerName="registry-server" containerID="cri-o://cbe2c90950d7e0f7337b11957e1d82f845fdd6499209a560071d7df64fee260a" gracePeriod=2 Jan 22 09:58:44 crc kubenswrapper[4681]: I0122 09:58:44.531027 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2d4f4" Jan 22 09:58:44 crc kubenswrapper[4681]: I0122 09:58:44.620516 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-catalog-content\") pod \"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49\" (UID: \"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49\") " Jan 22 09:58:44 crc kubenswrapper[4681]: I0122 09:58:44.620572 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-utilities\") pod \"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49\" (UID: \"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49\") " Jan 22 09:58:44 crc kubenswrapper[4681]: I0122 09:58:44.620720 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqrss\" (UniqueName: \"kubernetes.io/projected/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-kube-api-access-vqrss\") pod \"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49\" (UID: \"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49\") " Jan 22 09:58:44 crc kubenswrapper[4681]: I0122 09:58:44.622128 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-utilities" (OuterVolumeSpecName: "utilities") pod "7b06b8f4-2afb-46ff-8ddd-363f59f7fd49" (UID: "7b06b8f4-2afb-46ff-8ddd-363f59f7fd49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:58:44 crc kubenswrapper[4681]: I0122 09:58:44.630056 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-kube-api-access-vqrss" (OuterVolumeSpecName: "kube-api-access-vqrss") pod "7b06b8f4-2afb-46ff-8ddd-363f59f7fd49" (UID: "7b06b8f4-2afb-46ff-8ddd-363f59f7fd49"). InnerVolumeSpecName "kube-api-access-vqrss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 09:58:44 crc kubenswrapper[4681]: I0122 09:58:44.722971 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-utilities\") on node \"crc\" DevicePath \"\"" Jan 22 09:58:44 crc kubenswrapper[4681]: I0122 09:58:44.723034 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqrss\" (UniqueName: \"kubernetes.io/projected/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-kube-api-access-vqrss\") on node \"crc\" DevicePath \"\"" Jan 22 09:58:44 crc kubenswrapper[4681]: I0122 09:58:44.765891 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b06b8f4-2afb-46ff-8ddd-363f59f7fd49" (UID: "7b06b8f4-2afb-46ff-8ddd-363f59f7fd49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 22 09:58:44 crc kubenswrapper[4681]: I0122 09:58:44.824459 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 22 09:58:45 crc kubenswrapper[4681]: I0122 09:58:45.103971 4681 generic.go:334] "Generic (PLEG): container finished" podID="7b06b8f4-2afb-46ff-8ddd-363f59f7fd49" containerID="cbe2c90950d7e0f7337b11957e1d82f845fdd6499209a560071d7df64fee260a" exitCode=0 Jan 22 09:58:45 crc kubenswrapper[4681]: I0122 09:58:45.104047 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2d4f4" Jan 22 09:58:45 crc kubenswrapper[4681]: I0122 09:58:45.104041 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d4f4" event={"ID":"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49","Type":"ContainerDied","Data":"cbe2c90950d7e0f7337b11957e1d82f845fdd6499209a560071d7df64fee260a"} Jan 22 09:58:45 crc kubenswrapper[4681]: I0122 09:58:45.104708 4681 scope.go:117] "RemoveContainer" containerID="cbe2c90950d7e0f7337b11957e1d82f845fdd6499209a560071d7df64fee260a" Jan 22 09:58:45 crc kubenswrapper[4681]: I0122 09:58:45.104948 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d4f4" event={"ID":"7b06b8f4-2afb-46ff-8ddd-363f59f7fd49","Type":"ContainerDied","Data":"bfc42b65ebc990bec1f7af108861180674dc9d21e031400477315e8c2a0ff6fc"} Jan 22 09:58:45 crc kubenswrapper[4681]: I0122 09:58:45.136951 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2d4f4"] Jan 22 09:58:45 crc kubenswrapper[4681]: I0122 09:58:45.145167 4681 scope.go:117] "RemoveContainer" containerID="fb6a32fb119e55dfb9e138f48915fd28618eedef138d7506c241dcdda0818369" Jan 22 09:58:45 crc kubenswrapper[4681]: I0122 09:58:45.169662 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2d4f4"] Jan 22 09:58:45 crc kubenswrapper[4681]: I0122 09:58:45.170187 4681 scope.go:117] "RemoveContainer" containerID="b4e8bb2d8fe88721d42097510dc0a93eda767df0057f9a2d83260b3d1ef60f93" Jan 22 09:58:45 crc kubenswrapper[4681]: I0122 09:58:45.193117 4681 scope.go:117] "RemoveContainer" containerID="cbe2c90950d7e0f7337b11957e1d82f845fdd6499209a560071d7df64fee260a" Jan 22 09:58:45 crc kubenswrapper[4681]: E0122 09:58:45.193547 4681 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbe2c90950d7e0f7337b11957e1d82f845fdd6499209a560071d7df64fee260a\": container with ID starting with cbe2c90950d7e0f7337b11957e1d82f845fdd6499209a560071d7df64fee260a not found: ID does not exist" containerID="cbe2c90950d7e0f7337b11957e1d82f845fdd6499209a560071d7df64fee260a" Jan 22 09:58:45 crc kubenswrapper[4681]: I0122 09:58:45.193587 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbe2c90950d7e0f7337b11957e1d82f845fdd6499209a560071d7df64fee260a"} err="failed to get container status \"cbe2c90950d7e0f7337b11957e1d82f845fdd6499209a560071d7df64fee260a\": rpc error: code = NotFound desc = could not find container \"cbe2c90950d7e0f7337b11957e1d82f845fdd6499209a560071d7df64fee260a\": container with ID starting with cbe2c90950d7e0f7337b11957e1d82f845fdd6499209a560071d7df64fee260a not found: ID does not exist" Jan 22 09:58:45 crc kubenswrapper[4681]: I0122 09:58:45.193608 4681 scope.go:117] "RemoveContainer" containerID="fb6a32fb119e55dfb9e138f48915fd28618eedef138d7506c241dcdda0818369" Jan 22 09:58:45 crc kubenswrapper[4681]: E0122 09:58:45.194196 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb6a32fb119e55dfb9e138f48915fd28618eedef138d7506c241dcdda0818369\": container with ID starting with fb6a32fb119e55dfb9e138f48915fd28618eedef138d7506c241dcdda0818369 not found: ID does not exist" containerID="fb6a32fb119e55dfb9e138f48915fd28618eedef138d7506c241dcdda0818369" Jan 22 09:58:45 crc kubenswrapper[4681]: I0122 09:58:45.194254 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb6a32fb119e55dfb9e138f48915fd28618eedef138d7506c241dcdda0818369"} err="failed to get container status \"fb6a32fb119e55dfb9e138f48915fd28618eedef138d7506c241dcdda0818369\": rpc error: code = NotFound desc = could not find container \"fb6a32fb119e55dfb9e138f48915fd28618eedef138d7506c241dcdda0818369\": container with ID starting with fb6a32fb119e55dfb9e138f48915fd28618eedef138d7506c241dcdda0818369 not found: ID does not exist" Jan 22 09:58:45 crc kubenswrapper[4681]: I0122 09:58:45.194350 4681 scope.go:117] "RemoveContainer" containerID="b4e8bb2d8fe88721d42097510dc0a93eda767df0057f9a2d83260b3d1ef60f93" Jan 22 09:58:45 crc kubenswrapper[4681]: E0122 09:58:45.194693 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4e8bb2d8fe88721d42097510dc0a93eda767df0057f9a2d83260b3d1ef60f93\": container with ID starting with b4e8bb2d8fe88721d42097510dc0a93eda767df0057f9a2d83260b3d1ef60f93 not found: ID does not exist" containerID="b4e8bb2d8fe88721d42097510dc0a93eda767df0057f9a2d83260b3d1ef60f93" Jan 22 09:58:45 crc kubenswrapper[4681]: I0122 09:58:45.194732 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4e8bb2d8fe88721d42097510dc0a93eda767df0057f9a2d83260b3d1ef60f93"} err="failed to get container status \"b4e8bb2d8fe88721d42097510dc0a93eda767df0057f9a2d83260b3d1ef60f93\": rpc error: code = NotFound desc = could not find container \"b4e8bb2d8fe88721d42097510dc0a93eda767df0057f9a2d83260b3d1ef60f93\": container with ID starting with b4e8bb2d8fe88721d42097510dc0a93eda767df0057f9a2d83260b3d1ef60f93 not found: ID does not exist" Jan 22 09:58:45 crc kubenswrapper[4681]: I0122 09:58:45.460885 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="7b06b8f4-2afb-46ff-8ddd-363f59f7fd49" path="/var/lib/kubelet/pods/7b06b8f4-2afb-46ff-8ddd-363f59f7fd49/volumes" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.154162 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9"] Jan 22 10:00:00 crc kubenswrapper[4681]: E0122 10:00:00.155101 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b06b8f4-2afb-46ff-8ddd-363f59f7fd49" containerName="registry-server" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.155119 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b06b8f4-2afb-46ff-8ddd-363f59f7fd49" containerName="registry-server" Jan 22 10:00:00 crc kubenswrapper[4681]: E0122 10:00:00.155138 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b06b8f4-2afb-46ff-8ddd-363f59f7fd49" containerName="extract-content" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.155147 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b06b8f4-2afb-46ff-8ddd-363f59f7fd49" containerName="extract-content" Jan 22 10:00:00 crc kubenswrapper[4681]: E0122 10:00:00.155161 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b06b8f4-2afb-46ff-8ddd-363f59f7fd49" containerName="extract-utilities" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.155169 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b06b8f4-2afb-46ff-8ddd-363f59f7fd49" containerName="extract-utilities" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.155360 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b06b8f4-2afb-46ff-8ddd-363f59f7fd49" containerName="registry-server" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.155898 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.160878 4681 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.160959 4681 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.169616 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9"] Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.260567 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7701c7d3-3d61-458e-b89e-111223c0ea33-config-volume\") pod \"collect-profiles-29484600-lkfm9\" (UID: \"7701c7d3-3d61-458e-b89e-111223c0ea33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.260637 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7701c7d3-3d61-458e-b89e-111223c0ea33-secret-volume\") pod \"collect-profiles-29484600-lkfm9\" (UID: \"7701c7d3-3d61-458e-b89e-111223c0ea33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.260674 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcfx4\" (UniqueName: \"kubernetes.io/projected/7701c7d3-3d61-458e-b89e-111223c0ea33-kube-api-access-pcfx4\") pod \"collect-profiles-29484600-lkfm9\" (UID: \"7701c7d3-3d61-458e-b89e-111223c0ea33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.363574 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7701c7d3-3d61-458e-b89e-111223c0ea33-config-volume\") pod \"collect-profiles-29484600-lkfm9\" (UID: \"7701c7d3-3d61-458e-b89e-111223c0ea33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.362420 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7701c7d3-3d61-458e-b89e-111223c0ea33-config-volume\") pod \"collect-profiles-29484600-lkfm9\" (UID: \"7701c7d3-3d61-458e-b89e-111223c0ea33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.363708 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7701c7d3-3d61-458e-b89e-111223c0ea33-secret-volume\") pod \"collect-profiles-29484600-lkfm9\" (UID: \"7701c7d3-3d61-458e-b89e-111223c0ea33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.363754 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcfx4\" (UniqueName: \"kubernetes.io/projected/7701c7d3-3d61-458e-b89e-111223c0ea33-kube-api-access-pcfx4\") pod 
\"collect-profiles-29484600-lkfm9\" (UID: \"7701c7d3-3d61-458e-b89e-111223c0ea33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.389812 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7701c7d3-3d61-458e-b89e-111223c0ea33-secret-volume\") pod \"collect-profiles-29484600-lkfm9\" (UID: \"7701c7d3-3d61-458e-b89e-111223c0ea33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.390936 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcfx4\" (UniqueName: \"kubernetes.io/projected/7701c7d3-3d61-458e-b89e-111223c0ea33-kube-api-access-pcfx4\") pod \"collect-profiles-29484600-lkfm9\" (UID: \"7701c7d3-3d61-458e-b89e-111223c0ea33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.484402 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9" Jan 22 10:00:00 crc kubenswrapper[4681]: I0122 10:00:00.761379 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9"] Jan 22 10:00:01 crc kubenswrapper[4681]: I0122 10:00:01.120607 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9" event={"ID":"7701c7d3-3d61-458e-b89e-111223c0ea33","Type":"ContainerStarted","Data":"b82f64ab20a80ba29f2b15f9997bfed53cd5fea1f5867e81660866a67ee90a1c"} Jan 22 10:00:04 crc kubenswrapper[4681]: I0122 10:00:04.139569 4681 generic.go:334] "Generic (PLEG): container finished" podID="7701c7d3-3d61-458e-b89e-111223c0ea33" containerID="3eabdf324f649f4ef12c41b74496081a9e29ab035c959c0bc6478f4c6ce0447e" exitCode=0 Jan 22 10:00:04 crc kubenswrapper[4681]: I0122 10:00:04.139800 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9" event={"ID":"7701c7d3-3d61-458e-b89e-111223c0ea33","Type":"ContainerDied","Data":"3eabdf324f649f4ef12c41b74496081a9e29ab035c959c0bc6478f4c6ce0447e"} Jan 22 10:00:05 crc kubenswrapper[4681]: I0122 10:00:05.411640 4681 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9" Jan 22 10:00:05 crc kubenswrapper[4681]: I0122 10:00:05.533769 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcfx4\" (UniqueName: \"kubernetes.io/projected/7701c7d3-3d61-458e-b89e-111223c0ea33-kube-api-access-pcfx4\") pod \"7701c7d3-3d61-458e-b89e-111223c0ea33\" (UID: \"7701c7d3-3d61-458e-b89e-111223c0ea33\") " Jan 22 10:00:05 crc kubenswrapper[4681]: I0122 10:00:05.533892 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7701c7d3-3d61-458e-b89e-111223c0ea33-config-volume\") pod \"7701c7d3-3d61-458e-b89e-111223c0ea33\" (UID: \"7701c7d3-3d61-458e-b89e-111223c0ea33\") " Jan 22 10:00:05 crc kubenswrapper[4681]: I0122 10:00:05.534028 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7701c7d3-3d61-458e-b89e-111223c0ea33-secret-volume\") pod \"7701c7d3-3d61-458e-b89e-111223c0ea33\" (UID: \"7701c7d3-3d61-458e-b89e-111223c0ea33\") " Jan 22 10:00:05 crc kubenswrapper[4681]: I0122 10:00:05.535483 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7701c7d3-3d61-458e-b89e-111223c0ea33-config-volume" (OuterVolumeSpecName: "config-volume") pod "7701c7d3-3d61-458e-b89e-111223c0ea33" (UID: "7701c7d3-3d61-458e-b89e-111223c0ea33"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 22 10:00:05 crc kubenswrapper[4681]: I0122 10:00:05.543485 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7701c7d3-3d61-458e-b89e-111223c0ea33-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7701c7d3-3d61-458e-b89e-111223c0ea33" (UID: "7701c7d3-3d61-458e-b89e-111223c0ea33"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 22 10:00:05 crc kubenswrapper[4681]: I0122 10:00:05.560479 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7701c7d3-3d61-458e-b89e-111223c0ea33-kube-api-access-pcfx4" (OuterVolumeSpecName: "kube-api-access-pcfx4") pod "7701c7d3-3d61-458e-b89e-111223c0ea33" (UID: "7701c7d3-3d61-458e-b89e-111223c0ea33"). InnerVolumeSpecName "kube-api-access-pcfx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 22 10:00:05 crc kubenswrapper[4681]: I0122 10:00:05.635315 4681 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7701c7d3-3d61-458e-b89e-111223c0ea33-config-volume\") on node \"crc\" DevicePath \"\"" Jan 22 10:00:05 crc kubenswrapper[4681]: I0122 10:00:05.635357 4681 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7701c7d3-3d61-458e-b89e-111223c0ea33-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 22 10:00:05 crc kubenswrapper[4681]: I0122 10:00:05.635370 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcfx4\" (UniqueName: \"kubernetes.io/projected/7701c7d3-3d61-458e-b89e-111223c0ea33-kube-api-access-pcfx4\") on node \"crc\" DevicePath \"\"" Jan 22 10:00:06 crc kubenswrapper[4681]: I0122 10:00:06.158179 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9" event={"ID":"7701c7d3-3d61-458e-b89e-111223c0ea33","Type":"ContainerDied","Data":"b82f64ab20a80ba29f2b15f9997bfed53cd5fea1f5867e81660866a67ee90a1c"} Jan 22 10:00:06 crc kubenswrapper[4681]: I0122 10:00:06.158570 4681 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b82f64ab20a80ba29f2b15f9997bfed53cd5fea1f5867e81660866a67ee90a1c" Jan 22 10:00:06 crc kubenswrapper[4681]: I0122 10:00:06.158650 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29484600-lkfm9" Jan 22 10:00:06 crc kubenswrapper[4681]: I0122 10:00:06.513920 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66"] Jan 22 10:00:06 crc kubenswrapper[4681]: I0122 10:00:06.521131 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29484555-wfh66"] Jan 22 10:00:07 crc kubenswrapper[4681]: I0122 10:00:07.467528 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed18b43b-6da8-420f-86b0-2a3ec92f60ab" path="/var/lib/kubelet/pods/ed18b43b-6da8-420f-86b0-2a3ec92f60ab/volumes" Jan 22 10:00:26 crc kubenswrapper[4681]: I0122 10:00:26.031791 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 10:00:26 crc kubenswrapper[4681]: I0122 10:00:26.032402 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 10:00:31 crc kubenswrapper[4681]: I0122 10:00:31.306237 4681 scope.go:117] "RemoveContainer" containerID="bb4c3e0d3b68af662f1157338e9ba37923156bac785060429df0ad07d300114a" Jan 22 10:00:56 crc kubenswrapper[4681]: I0122 10:00:56.033451 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Jan 22 10:00:56 crc kubenswrapper[4681]: I0122 10:00:56.033949 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 10:01:26 crc kubenswrapper[4681]: I0122 10:01:26.030894 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 22 10:01:26 crc kubenswrapper[4681]: I0122 10:01:26.031364 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 22 10:01:26 crc kubenswrapper[4681]: I0122 10:01:26.031410 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" Jan 22 10:01:26 crc kubenswrapper[4681]: I0122 10:01:26.032025 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5a0ef703eb0854d8032ccfc4597b68af7af2123887c04b1946b6f324a63c5f21"} pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 22 10:01:26 crc kubenswrapper[4681]: I0122 10:01:26.032071 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" containerID="cri-o://5a0ef703eb0854d8032ccfc4597b68af7af2123887c04b1946b6f324a63c5f21" gracePeriod=600 Jan 22 10:01:26 crc kubenswrapper[4681]: I0122 10:01:26.843441 4681 generic.go:334] "Generic (PLEG): container finished" podID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerID="5a0ef703eb0854d8032ccfc4597b68af7af2123887c04b1946b6f324a63c5f21" exitCode=0 Jan 22 10:01:26 crc kubenswrapper[4681]: I0122 10:01:26.844188 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerDied","Data":"5a0ef703eb0854d8032ccfc4597b68af7af2123887c04b1946b6f324a63c5f21"} Jan 22 10:01:26 crc kubenswrapper[4681]: I0122 10:01:26.844244 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerStarted","Data":"5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"} Jan 22 10:01:26 crc kubenswrapper[4681]: I0122 10:01:26.844282 4681 scope.go:117] "RemoveContainer" containerID="3ec6c0b0d23a85897cf561e30e5858fdbb02235a695edc970bbe41db052b7ae0" Jan 22 10:02:51 crc kubenswrapper[4681]: I0122 10:02:51.320184 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-operators-d7ppt"] Jan 22 10:02:51 crc kubenswrapper[4681]: E0122 10:02:51.321031 4681 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7701c7d3-3d61-458e-b89e-111223c0ea33" containerName="collect-profiles" Jan 22 10:02:51 crc kubenswrapper[4681]: I0122 10:02:51.321043 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="7701c7d3-3d61-458e-b89e-111223c0ea33" containerName="collect-profiles" Jan 22 10:02:51 crc kubenswrapper[4681]: I0122 10:02:51.321179 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="7701c7d3-3d61-458e-b89e-111223c0ea33" containerName="collect-profiles" Jan 22 10:02:51 crc kubenswrapper[4681]: I0122 10:02:51.321584 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-d7ppt" Jan 22 10:02:51 crc kubenswrapper[4681]: I0122 10:02:51.353893 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-d7ppt"] Jan 22 10:02:51 crc kubenswrapper[4681]: I0122 10:02:51.442097 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dhnr\" (UniqueName: \"kubernetes.io/projected/727abac3-db64-4c3f-bc8f-a071e3cb7513-kube-api-access-5dhnr\") pod \"service-telemetry-framework-operators-d7ppt\" (UID: \"727abac3-db64-4c3f-bc8f-a071e3cb7513\") " pod="service-telemetry/service-telemetry-framework-operators-d7ppt" Jan 22 10:02:51 crc kubenswrapper[4681]: I0122 10:02:51.547198 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dhnr\" (UniqueName: \"kubernetes.io/projected/727abac3-db64-4c3f-bc8f-a071e3cb7513-kube-api-access-5dhnr\") pod \"service-telemetry-framework-operators-d7ppt\" (UID: \"727abac3-db64-4c3f-bc8f-a071e3cb7513\") " pod="service-telemetry/service-telemetry-framework-operators-d7ppt" Jan 22 10:02:51 crc kubenswrapper[4681]: I0122 10:02:51.579170 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dhnr\" (UniqueName: \"kubernetes.io/projected/727abac3-db64-4c3f-bc8f-a071e3cb7513-kube-api-access-5dhnr\") pod \"service-telemetry-framework-operators-d7ppt\" (UID: \"727abac3-db64-4c3f-bc8f-a071e3cb7513\") " pod="service-telemetry/service-telemetry-framework-operators-d7ppt" Jan 22 10:02:51 crc kubenswrapper[4681]: I0122 10:02:51.671845 4681 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-d7ppt"
Jan 22 10:02:52 crc kubenswrapper[4681]: I0122 10:02:52.099552 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-d7ppt"]
Jan 22 10:02:52 crc kubenswrapper[4681]: W0122 10:02:52.110364 4681 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod727abac3_db64_4c3f_bc8f_a071e3cb7513.slice/crio-da359f06fce190b26b119dda0c44068c63163227b33a84afd05564408c9a2606 WatchSource:0}: Error finding container da359f06fce190b26b119dda0c44068c63163227b33a84afd05564408c9a2606: Status 404 returned error can't find the container with id da359f06fce190b26b119dda0c44068c63163227b33a84afd05564408c9a2606
Jan 22 10:02:52 crc kubenswrapper[4681]: I0122 10:02:52.113338 4681 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 22 10:02:52 crc kubenswrapper[4681]: I0122 10:02:52.707948 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-d7ppt" event={"ID":"727abac3-db64-4c3f-bc8f-a071e3cb7513","Type":"ContainerStarted","Data":"da359f06fce190b26b119dda0c44068c63163227b33a84afd05564408c9a2606"}
Jan 22 10:02:55 crc kubenswrapper[4681]: I0122 10:02:55.730380 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-d7ppt" event={"ID":"727abac3-db64-4c3f-bc8f-a071e3cb7513","Type":"ContainerStarted","Data":"207c0e0c01dfdcb5750a74d76057a301dda4a37a15bdb4946fc991d23a8c3f34"}
Jan 22 10:02:55 crc kubenswrapper[4681]: I0122 10:02:55.745712 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-operators-d7ppt" podStartSLOduration=1.9104960549999999 podStartE2EDuration="4.745690649s" podCreationTimestamp="2026-01-22 10:02:51 +0000 UTC" firstStartedPulling="2026-01-22 10:02:52.112928107 +0000 UTC m=+3562.938838612" lastFinishedPulling="2026-01-22 10:02:54.948122701 +0000 UTC m=+3565.774033206" observedRunningTime="2026-01-22 10:02:55.74381448 +0000 UTC m=+3566.569724985" watchObservedRunningTime="2026-01-22 10:02:55.745690649 +0000 UTC m=+3566.571601154"
Jan 22 10:03:01 crc kubenswrapper[4681]: I0122 10:03:01.672359 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/service-telemetry-framework-operators-d7ppt"
Jan 22 10:03:01 crc kubenswrapper[4681]: I0122 10:03:01.672961 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/service-telemetry-framework-operators-d7ppt"
Jan 22 10:03:01 crc kubenswrapper[4681]: I0122 10:03:01.704390 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/service-telemetry-framework-operators-d7ppt"
Jan 22 10:03:01 crc kubenswrapper[4681]: I0122 10:03:01.814015 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/service-telemetry-framework-operators-d7ppt"
Jan 22 10:03:02 crc kubenswrapper[4681]: I0122 10:03:02.091810 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-d7ppt"]
Jan 22 10:03:03 crc kubenswrapper[4681]: I0122 10:03:03.791690 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-framework-operators-d7ppt" podUID="727abac3-db64-4c3f-bc8f-a071e3cb7513" containerName="registry-server" containerID="cri-o://207c0e0c01dfdcb5750a74d76057a301dda4a37a15bdb4946fc991d23a8c3f34" gracePeriod=2
Jan 22 10:03:04 crc kubenswrapper[4681]: I0122 10:03:04.182434 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-d7ppt"
Jan 22 10:03:04 crc kubenswrapper[4681]: I0122 10:03:04.379070 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dhnr\" (UniqueName: \"kubernetes.io/projected/727abac3-db64-4c3f-bc8f-a071e3cb7513-kube-api-access-5dhnr\") pod \"727abac3-db64-4c3f-bc8f-a071e3cb7513\" (UID: \"727abac3-db64-4c3f-bc8f-a071e3cb7513\") "
Jan 22 10:03:04 crc kubenswrapper[4681]: I0122 10:03:04.385118 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/727abac3-db64-4c3f-bc8f-a071e3cb7513-kube-api-access-5dhnr" (OuterVolumeSpecName: "kube-api-access-5dhnr") pod "727abac3-db64-4c3f-bc8f-a071e3cb7513" (UID: "727abac3-db64-4c3f-bc8f-a071e3cb7513"). InnerVolumeSpecName "kube-api-access-5dhnr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 10:03:04 crc kubenswrapper[4681]: I0122 10:03:04.481915 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dhnr\" (UniqueName: \"kubernetes.io/projected/727abac3-db64-4c3f-bc8f-a071e3cb7513-kube-api-access-5dhnr\") on node \"crc\" DevicePath \"\""
Jan 22 10:03:04 crc kubenswrapper[4681]: I0122 10:03:04.799987 4681 generic.go:334] "Generic (PLEG): container finished" podID="727abac3-db64-4c3f-bc8f-a071e3cb7513" containerID="207c0e0c01dfdcb5750a74d76057a301dda4a37a15bdb4946fc991d23a8c3f34" exitCode=0
Jan 22 10:03:04 crc kubenswrapper[4681]: I0122 10:03:04.800069 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-operators-d7ppt"
Jan 22 10:03:04 crc kubenswrapper[4681]: I0122 10:03:04.800071 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-d7ppt" event={"ID":"727abac3-db64-4c3f-bc8f-a071e3cb7513","Type":"ContainerDied","Data":"207c0e0c01dfdcb5750a74d76057a301dda4a37a15bdb4946fc991d23a8c3f34"}
Jan 22 10:03:04 crc kubenswrapper[4681]: I0122 10:03:04.800179 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-operators-d7ppt" event={"ID":"727abac3-db64-4c3f-bc8f-a071e3cb7513","Type":"ContainerDied","Data":"da359f06fce190b26b119dda0c44068c63163227b33a84afd05564408c9a2606"}
Jan 22 10:03:04 crc kubenswrapper[4681]: I0122 10:03:04.800213 4681 scope.go:117] "RemoveContainer" containerID="207c0e0c01dfdcb5750a74d76057a301dda4a37a15bdb4946fc991d23a8c3f34"
Jan 22 10:03:04 crc kubenswrapper[4681]: I0122 10:03:04.826237 4681 scope.go:117] "RemoveContainer" containerID="207c0e0c01dfdcb5750a74d76057a301dda4a37a15bdb4946fc991d23a8c3f34"
Jan 22 10:03:04 crc kubenswrapper[4681]: E0122 10:03:04.826877 4681 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"207c0e0c01dfdcb5750a74d76057a301dda4a37a15bdb4946fc991d23a8c3f34\": container with ID starting with 207c0e0c01dfdcb5750a74d76057a301dda4a37a15bdb4946fc991d23a8c3f34 not found: ID does not exist" containerID="207c0e0c01dfdcb5750a74d76057a301dda4a37a15bdb4946fc991d23a8c3f34"
Jan 22 10:03:04 crc kubenswrapper[4681]: I0122 10:03:04.826921 4681 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207c0e0c01dfdcb5750a74d76057a301dda4a37a15bdb4946fc991d23a8c3f34"} err="failed to get container status \"207c0e0c01dfdcb5750a74d76057a301dda4a37a15bdb4946fc991d23a8c3f34\": rpc error: code = NotFound desc = could not find container \"207c0e0c01dfdcb5750a74d76057a301dda4a37a15bdb4946fc991d23a8c3f34\": container with ID starting with 207c0e0c01dfdcb5750a74d76057a301dda4a37a15bdb4946fc991d23a8c3f34 not found: ID does not exist"
Jan 22 10:03:04 crc kubenswrapper[4681]: I0122 10:03:04.843476 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-d7ppt"]
Jan 22 10:03:04 crc kubenswrapper[4681]: I0122 10:03:04.849939 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-framework-operators-d7ppt"]
Jan 22 10:03:05 crc kubenswrapper[4681]: I0122 10:03:05.466991 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="727abac3-db64-4c3f-bc8f-a071e3cb7513" path="/var/lib/kubelet/pods/727abac3-db64-4c3f-bc8f-a071e3cb7513/volumes"
Jan 22 10:03:26 crc kubenswrapper[4681]: I0122 10:03:26.031744 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 22 10:03:26 crc kubenswrapper[4681]: I0122 10:03:26.032816 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 22 10:03:56 crc kubenswrapper[4681]: I0122 10:03:56.031089 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 22 10:03:56 crc kubenswrapper[4681]: I0122 10:03:56.031706 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 22 10:04:05 crc kubenswrapper[4681]: I0122 10:04:05.929302 4681 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rghgq"]
Jan 22 10:04:05 crc kubenswrapper[4681]: E0122 10:04:05.930214 4681 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="727abac3-db64-4c3f-bc8f-a071e3cb7513" containerName="registry-server"
Jan 22 10:04:05 crc kubenswrapper[4681]: I0122 10:04:05.930229 4681 state_mem.go:107] "Deleted CPUSet assignment" podUID="727abac3-db64-4c3f-bc8f-a071e3cb7513" containerName="registry-server"
Jan 22 10:04:05 crc kubenswrapper[4681]: I0122 10:04:05.930423 4681 memory_manager.go:354] "RemoveStaleState removing state" podUID="727abac3-db64-4c3f-bc8f-a071e3cb7513" containerName="registry-server"
Jan 22 10:04:05 crc kubenswrapper[4681]: I0122 10:04:05.931707 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rghgq"
Jan 22 10:04:05 crc kubenswrapper[4681]: I0122 10:04:05.942197 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rghgq"]
Jan 22 10:04:06 crc kubenswrapper[4681]: I0122 10:04:06.068140 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shfl5\" (UniqueName: \"kubernetes.io/projected/1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1-kube-api-access-shfl5\") pod \"certified-operators-rghgq\" (UID: \"1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1\") " pod="openshift-marketplace/certified-operators-rghgq"
Jan 22 10:04:06 crc kubenswrapper[4681]: I0122 10:04:06.068209 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1-catalog-content\") pod \"certified-operators-rghgq\" (UID: \"1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1\") " pod="openshift-marketplace/certified-operators-rghgq"
Jan 22 10:04:06 crc kubenswrapper[4681]: I0122 10:04:06.068349 4681 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1-utilities\") pod \"certified-operators-rghgq\" (UID: \"1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1\") " pod="openshift-marketplace/certified-operators-rghgq"
Jan 22 10:04:06 crc kubenswrapper[4681]: I0122 10:04:06.169360 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shfl5\" (UniqueName: \"kubernetes.io/projected/1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1-kube-api-access-shfl5\") pod \"certified-operators-rghgq\" (UID: \"1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1\") " pod="openshift-marketplace/certified-operators-rghgq"
Jan 22 10:04:06 crc kubenswrapper[4681]: I0122 10:04:06.169408 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1-catalog-content\") pod \"certified-operators-rghgq\" (UID: \"1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1\") " pod="openshift-marketplace/certified-operators-rghgq"
Jan 22 10:04:06 crc kubenswrapper[4681]: I0122 10:04:06.169433 4681 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1-utilities\") pod \"certified-operators-rghgq\" (UID: \"1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1\") " pod="openshift-marketplace/certified-operators-rghgq"
Jan 22 10:04:06 crc kubenswrapper[4681]: I0122 10:04:06.169954 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1-utilities\") pod \"certified-operators-rghgq\" (UID: \"1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1\") " pod="openshift-marketplace/certified-operators-rghgq"
Jan 22 10:04:06 crc kubenswrapper[4681]: I0122 10:04:06.170201 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1-catalog-content\") pod \"certified-operators-rghgq\" (UID: \"1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1\") " pod="openshift-marketplace/certified-operators-rghgq"
Jan 22 10:04:06 crc kubenswrapper[4681]: I0122 10:04:06.193889 4681 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shfl5\" (UniqueName: \"kubernetes.io/projected/1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1-kube-api-access-shfl5\") pod \"certified-operators-rghgq\" (UID: \"1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1\") " pod="openshift-marketplace/certified-operators-rghgq"
Jan 22 10:04:06 crc kubenswrapper[4681]: I0122 10:04:06.271749 4681 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rghgq"
Jan 22 10:04:06 crc kubenswrapper[4681]: I0122 10:04:06.593721 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rghgq"]
Jan 22 10:04:07 crc kubenswrapper[4681]: I0122 10:04:07.298882 4681 generic.go:334] "Generic (PLEG): container finished" podID="1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1" containerID="45c528729dec50f43c9faf6e3ce14c1dda2e401039e298423505677f23403f05" exitCode=0
Jan 22 10:04:07 crc kubenswrapper[4681]: I0122 10:04:07.298994 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rghgq" event={"ID":"1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1","Type":"ContainerDied","Data":"45c528729dec50f43c9faf6e3ce14c1dda2e401039e298423505677f23403f05"}
Jan 22 10:04:07 crc kubenswrapper[4681]: I0122 10:04:07.299041 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rghgq" event={"ID":"1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1","Type":"ContainerStarted","Data":"6032f7c07620ff77f17f91e6d62791eb63d1857a6fda30e58b312b634ff6daf8"}
Jan 22 10:04:15 crc kubenswrapper[4681]: I0122 10:04:15.371603 4681 generic.go:334] "Generic (PLEG): container finished" podID="1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1" containerID="74c6b46ed31254cd0541850df560093ed1a6f51ad38426a5480fb2503dc6353e" exitCode=0
Jan 22 10:04:15 crc kubenswrapper[4681]: I0122 10:04:15.372005 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rghgq" event={"ID":"1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1","Type":"ContainerDied","Data":"74c6b46ed31254cd0541850df560093ed1a6f51ad38426a5480fb2503dc6353e"}
Jan 22 10:04:16 crc kubenswrapper[4681]: I0122 10:04:16.384175 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rghgq" event={"ID":"1b8df6b2-ddce-4a8d-982d-fca07ac3f0f1","Type":"ContainerStarted","Data":"5f59ad6d910ef29e02f83361017a1bd9207336a5e0cf4f36b31e8d2d63fb7d78"}
Jan 22 10:04:16 crc kubenswrapper[4681]: I0122 10:04:16.404015 4681 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rghgq" podStartSLOduration=2.961592853 podStartE2EDuration="11.403982615s" podCreationTimestamp="2026-01-22 10:04:05 +0000 UTC" firstStartedPulling="2026-01-22 10:04:07.302703191 +0000 UTC m=+3638.128613706" lastFinishedPulling="2026-01-22 10:04:15.745092953 +0000 UTC m=+3646.571003468" observedRunningTime="2026-01-22 10:04:16.403302897 +0000 UTC m=+3647.229213402" watchObservedRunningTime="2026-01-22 10:04:16.403982615 +0000 UTC m=+3647.229907581"
Jan 22 10:04:26 crc kubenswrapper[4681]: I0122 10:04:26.032186 4681 patch_prober.go:28] interesting pod/machine-config-daemon-zb7wn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 22 10:04:26 crc kubenswrapper[4681]: I0122 10:04:26.035132 4681 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 22 10:04:26 crc kubenswrapper[4681]: I0122 10:04:26.035407 4681 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn"
Jan 22 10:04:26 crc kubenswrapper[4681]: I0122 10:04:26.036946 4681 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"} pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 22 10:04:26 crc kubenswrapper[4681]: I0122 10:04:26.037599 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerName="machine-config-daemon" containerID="cri-o://5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943" gracePeriod=600
Jan 22 10:04:26 crc kubenswrapper[4681]: I0122 10:04:26.272948 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rghgq"
Jan 22 10:04:26 crc kubenswrapper[4681]: I0122 10:04:26.273303 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rghgq"
Jan 22 10:04:26 crc kubenswrapper[4681]: I0122 10:04:26.323608 4681 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rghgq"
Jan 22 10:04:26 crc kubenswrapper[4681]: I0122 10:04:26.477317 4681 generic.go:334] "Generic (PLEG): container finished" podID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943" exitCode=0
Jan 22 10:04:26 crc kubenswrapper[4681]: I0122 10:04:26.477461 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" event={"ID":"d58a61a8-a6b2-4af6-92a6-c7bf6da6a432","Type":"ContainerDied","Data":"5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"}
Jan 22 10:04:26 crc kubenswrapper[4681]: I0122 10:04:26.477577 4681 scope.go:117] "RemoveContainer" containerID="5a0ef703eb0854d8032ccfc4597b68af7af2123887c04b1946b6f324a63c5f21"
Jan 22 10:04:26 crc kubenswrapper[4681]: I0122 10:04:26.519703 4681 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rghgq"
Jan 22 10:04:26 crc kubenswrapper[4681]: I0122 10:04:26.625233 4681 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rghgq"]
Jan 22 10:04:26 crc kubenswrapper[4681]: I0122 10:04:26.665688 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7jtrb"]
Jan 22 10:04:26 crc kubenswrapper[4681]: I0122 10:04:26.666701 4681 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7jtrb" podUID="94bafc7c-e628-4827-b1b7-f016d562bd9f" containerName="registry-server" containerID="cri-o://213843b50db23e594e7d27cbd5d530e8e3b588e3fbf79ca6f0d351540da5de56" gracePeriod=2
Jan 22 10:04:28 crc kubenswrapper[4681]: E0122 10:04:28.108122 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"
Jan 22 10:04:28 crc kubenswrapper[4681]: I0122 10:04:28.497214 4681 scope.go:117] "RemoveContainer" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"
Jan 22 10:04:28 crc kubenswrapper[4681]: E0122 10:04:28.497868 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"
Jan 22 10:04:28 crc kubenswrapper[4681]: I0122 10:04:28.502812 4681 generic.go:334] "Generic (PLEG): container finished" podID="94bafc7c-e628-4827-b1b7-f016d562bd9f" containerID="213843b50db23e594e7d27cbd5d530e8e3b588e3fbf79ca6f0d351540da5de56" exitCode=0
Jan 22 10:04:28 crc kubenswrapper[4681]: I0122 10:04:28.502853 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7jtrb" event={"ID":"94bafc7c-e628-4827-b1b7-f016d562bd9f","Type":"ContainerDied","Data":"213843b50db23e594e7d27cbd5d530e8e3b588e3fbf79ca6f0d351540da5de56"}
Jan 22 10:04:29 crc kubenswrapper[4681]: I0122 10:04:29.160171 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7jtrb"
Jan 22 10:04:29 crc kubenswrapper[4681]: I0122 10:04:29.301506 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfkfm\" (UniqueName: \"kubernetes.io/projected/94bafc7c-e628-4827-b1b7-f016d562bd9f-kube-api-access-vfkfm\") pod \"94bafc7c-e628-4827-b1b7-f016d562bd9f\" (UID: \"94bafc7c-e628-4827-b1b7-f016d562bd9f\") "
Jan 22 10:04:29 crc kubenswrapper[4681]: I0122 10:04:29.301758 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94bafc7c-e628-4827-b1b7-f016d562bd9f-utilities\") pod \"94bafc7c-e628-4827-b1b7-f016d562bd9f\" (UID: \"94bafc7c-e628-4827-b1b7-f016d562bd9f\") "
Jan 22 10:04:29 crc kubenswrapper[4681]: I0122 10:04:29.303147 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94bafc7c-e628-4827-b1b7-f016d562bd9f-utilities" (OuterVolumeSpecName: "utilities") pod "94bafc7c-e628-4827-b1b7-f016d562bd9f" (UID: "94bafc7c-e628-4827-b1b7-f016d562bd9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 10:04:29 crc kubenswrapper[4681]: I0122 10:04:29.303345 4681 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94bafc7c-e628-4827-b1b7-f016d562bd9f-catalog-content\") pod \"94bafc7c-e628-4827-b1b7-f016d562bd9f\" (UID: \"94bafc7c-e628-4827-b1b7-f016d562bd9f\") "
Jan 22 10:04:29 crc kubenswrapper[4681]: I0122 10:04:29.303938 4681 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94bafc7c-e628-4827-b1b7-f016d562bd9f-utilities\") on node \"crc\" DevicePath \"\""
Jan 22 10:04:29 crc kubenswrapper[4681]: I0122 10:04:29.307134 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94bafc7c-e628-4827-b1b7-f016d562bd9f-kube-api-access-vfkfm" (OuterVolumeSpecName: "kube-api-access-vfkfm") pod "94bafc7c-e628-4827-b1b7-f016d562bd9f" (UID: "94bafc7c-e628-4827-b1b7-f016d562bd9f"). InnerVolumeSpecName "kube-api-access-vfkfm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 22 10:04:29 crc kubenswrapper[4681]: I0122 10:04:29.340375 4681 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94bafc7c-e628-4827-b1b7-f016d562bd9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94bafc7c-e628-4827-b1b7-f016d562bd9f" (UID: "94bafc7c-e628-4827-b1b7-f016d562bd9f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 22 10:04:29 crc kubenswrapper[4681]: I0122 10:04:29.405311 4681 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfkfm\" (UniqueName: \"kubernetes.io/projected/94bafc7c-e628-4827-b1b7-f016d562bd9f-kube-api-access-vfkfm\") on node \"crc\" DevicePath \"\""
Jan 22 10:04:29 crc kubenswrapper[4681]: I0122 10:04:29.405359 4681 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94bafc7c-e628-4827-b1b7-f016d562bd9f-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 22 10:04:29 crc kubenswrapper[4681]: I0122 10:04:29.515407 4681 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7jtrb" event={"ID":"94bafc7c-e628-4827-b1b7-f016d562bd9f","Type":"ContainerDied","Data":"521a74266c24908fe8216a027bd04ad51e37d4c71e22fe7d16f0bf49a20c6d96"}
Jan 22 10:04:29 crc kubenswrapper[4681]: I0122 10:04:29.515487 4681 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7jtrb"
Jan 22 10:04:29 crc kubenswrapper[4681]: I0122 10:04:29.515540 4681 scope.go:117] "RemoveContainer" containerID="213843b50db23e594e7d27cbd5d530e8e3b588e3fbf79ca6f0d351540da5de56"
Jan 22 10:04:29 crc kubenswrapper[4681]: I0122 10:04:29.547889 4681 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7jtrb"]
Jan 22 10:04:29 crc kubenswrapper[4681]: I0122 10:04:29.554901 4681 scope.go:117] "RemoveContainer" containerID="91c115c5efcb89e764b409966cb61141f03a027a34dbccddf0e1df7bb5d56c44"
Jan 22 10:04:29 crc kubenswrapper[4681]: I0122 10:04:29.555535 4681 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7jtrb"]
Jan 22 10:04:29 crc kubenswrapper[4681]: I0122 10:04:29.587192 4681 scope.go:117] "RemoveContainer" containerID="b8a3b890efb1a9deeac7f0ba9e3f37ad70b72569f722f8f9fba26c690c9537e3"
Jan 22 10:04:31 crc kubenswrapper[4681]: I0122 10:04:31.475470 4681 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94bafc7c-e628-4827-b1b7-f016d562bd9f" path="/var/lib/kubelet/pods/94bafc7c-e628-4827-b1b7-f016d562bd9f/volumes"
Jan 22 10:04:41 crc kubenswrapper[4681]: I0122 10:04:41.452336 4681 scope.go:117] "RemoveContainer" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"
Jan 22 10:04:41 crc kubenswrapper[4681]: E0122 10:04:41.453048 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"
Jan 22 10:04:56 crc kubenswrapper[4681]: I0122 10:04:56.452712 4681 scope.go:117] "RemoveContainer" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"
Jan 22 10:04:56 crc kubenswrapper[4681]: E0122 10:04:56.453447 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"
Jan 22 10:05:11 crc kubenswrapper[4681]: I0122 10:05:11.452978 4681 scope.go:117] "RemoveContainer" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"
Jan 22 10:05:11 crc kubenswrapper[4681]: E0122 10:05:11.453843 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"
Jan 22 10:05:24 crc kubenswrapper[4681]: I0122 10:05:24.452552 4681 scope.go:117] "RemoveContainer" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"
Jan 22 10:05:24 crc kubenswrapper[4681]: E0122 10:05:24.453353 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"
Jan 22 10:05:35 crc kubenswrapper[4681]: I0122 10:05:35.453472 4681 scope.go:117] "RemoveContainer" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"
Jan 22 10:05:35 crc kubenswrapper[4681]: E0122 10:05:35.456875 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"
Jan 22 10:05:48 crc kubenswrapper[4681]: I0122 10:05:48.453985 4681 scope.go:117] "RemoveContainer" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"
Jan 22 10:05:48 crc kubenswrapper[4681]: E0122 10:05:48.455517 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"
Jan 22 10:06:02 crc kubenswrapper[4681]: I0122 10:06:02.452726 4681 scope.go:117] "RemoveContainer" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"
Jan 22 10:06:02 crc kubenswrapper[4681]: E0122 10:06:02.453690 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"
Jan 22 10:06:16 crc kubenswrapper[4681]: I0122 10:06:16.453589 4681 scope.go:117] "RemoveContainer" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"
Jan 22 10:06:16 crc kubenswrapper[4681]: E0122 10:06:16.454684 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"
Jan 22 10:06:28 crc kubenswrapper[4681]: I0122 10:06:28.452849 4681 scope.go:117] "RemoveContainer" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"
Jan 22 10:06:28 crc kubenswrapper[4681]: E0122 10:06:28.453788 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"
Jan 22 10:06:39 crc kubenswrapper[4681]: I0122 10:06:39.455994 4681 scope.go:117] "RemoveContainer" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"
Jan 22 10:06:39 crc kubenswrapper[4681]: E0122 10:06:39.456806 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"
Jan 22 10:06:52 crc kubenswrapper[4681]: I0122 10:06:52.452772 4681 scope.go:117] "RemoveContainer" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"
Jan 22 10:06:52 crc kubenswrapper[4681]: E0122 10:06:52.453468 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"
Jan 22 10:07:06 crc kubenswrapper[4681]: I0122 10:07:06.454203 4681 scope.go:117] "RemoveContainer" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"
Jan 22 10:07:06 crc kubenswrapper[4681]: E0122 10:07:06.455128 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"
Jan 22 10:07:21 crc kubenswrapper[4681]: I0122 10:07:21.452061 4681 scope.go:117] "RemoveContainer" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"
Jan 22 10:07:21 crc kubenswrapper[4681]: E0122 10:07:21.452870 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"
Jan 22 10:07:32 crc kubenswrapper[4681]: I0122 10:07:32.453021 4681 scope.go:117] "RemoveContainer" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"
Jan 22 10:07:32 crc kubenswrapper[4681]: E0122 10:07:32.453877 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"
Jan 22 10:07:44 crc kubenswrapper[4681]: I0122 10:07:44.452621 4681 scope.go:117] "RemoveContainer" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"
Jan 22 10:07:44 crc kubenswrapper[4681]: E0122 10:07:44.453459 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"
Jan 22 10:07:58 crc kubenswrapper[4681]: I0122 10:07:58.454064 4681 scope.go:117] "RemoveContainer" containerID="5324687a113de6fe5b57d72b25c92a11cf6d519592a83690fc6acac7c253e943"
Jan 22 10:07:58 crc kubenswrapper[4681]: E0122 10:07:58.455193 4681 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zb7wn_openshift-machine-config-operator(d58a61a8-a6b2-4af6-92a6-c7bf6da6a432)\"" pod="openshift-machine-config-operator/machine-config-daemon-zb7wn" podUID="d58a61a8-a6b2-4af6-92a6-c7bf6da6a432"