Mar 20 12:27:16 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 12:27:16 crc restorecon[4680]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 12:27:16 crc restorecon[4680]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 12:27:16 crc restorecon[4680]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc 
restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:16 crc restorecon[4680]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 12:27:16 crc restorecon[4680]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc 
restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 
12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 12:27:16 crc restorecon[4680]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc 
restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 12:27:16 crc restorecon[4680]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 12:27:16 crc restorecon[4680]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 12:27:16 crc restorecon[4680]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 12:27:16 crc 
restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 12:27:16 crc restorecon[4680]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:16 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17
crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 
12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 12:27:17 crc 
restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc 
restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc 
restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 
crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc 
restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc 
restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc 
restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc 
restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc 
restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 12:27:17 crc restorecon[4680]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 12:27:17 crc restorecon[4680]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 12:27:18 crc kubenswrapper[4817]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 12:27:18 crc kubenswrapper[4817]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 12:27:18 crc kubenswrapper[4817]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 12:27:18 crc kubenswrapper[4817]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 12:27:18 crc kubenswrapper[4817]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 12:27:18 crc kubenswrapper[4817]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.368756 4817 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377333 4817 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377412 4817 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377422 4817 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377430 4817 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377439 4817 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377446 4817 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377455 4817 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377498 4817 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377507 4817 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377517 4817 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377526 4817 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377536 4817 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377546 4817 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377554 4817 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377562 4817 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377570 4817 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377581 4817 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377591 4817 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377601 4817 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377609 4817 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377617 4817 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377625 4817 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377632 4817 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377642 4817 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377649 4817 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377657 4817 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377665 4817 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377672 4817 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377683 4817 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377691 4817 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377699 4817 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377708 4817 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377716 4817 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377737 4817 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377746 4817 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377754 4817 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377763 4817 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377772 4817 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377780 4817 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377788 4817 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377797 4817 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377806 4817 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377814 4817 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377822 4817 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377831 4817 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377840 4817 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377849 4817 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377857 4817 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377865 4817 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377873 4817 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377881 4817 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377889 4817 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377897 4817 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377905 4817 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377912 4817 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377920 4817 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377927 4817 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377935 4817 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377944 4817 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377952 4817 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377960 4817 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377967 4817 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377975 4817 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377982 4817 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377989 4817 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.377997 4817 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.378004 4817 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.378012 4817 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.378020 4817 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.378027 4817 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.378037 4817 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381393 4817 flags.go:64] FLAG: --address="0.0.0.0"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381432 4817 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381450 4817 flags.go:64] FLAG: --anonymous-auth="true"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381461 4817 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381473 4817 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381517 4817 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381529 4817 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381540 4817 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381550 4817 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381559 4817 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381569 4817 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381580 4817 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381589 4817 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381598 4817 flags.go:64] FLAG: --cgroup-root=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381607 4817 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381616 4817 flags.go:64] FLAG: --client-ca-file=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381624 4817 flags.go:64] FLAG: --cloud-config=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381634 4817 flags.go:64] FLAG: --cloud-provider=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381642 4817 flags.go:64] FLAG: --cluster-dns="[]"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381654 4817 flags.go:64] FLAG: --cluster-domain=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381663 4817 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381673 4817 flags.go:64] FLAG: --config-dir=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381682 4817 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381693 4817 flags.go:64] FLAG: --container-log-max-files="5"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381703 4817 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381712 4817 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381721 4817 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381731 4817 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381740 4817 flags.go:64] FLAG: --contention-profiling="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381749 4817 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381758 4817 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381767 4817 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381776 4817 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381787 4817 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381796 4817 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381805 4817 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381813 4817 flags.go:64] FLAG: --enable-load-reader="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381822 4817 flags.go:64] FLAG: --enable-server="true"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381831 4817 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381842 4817 flags.go:64] FLAG: --event-burst="100"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381852 4817 flags.go:64] FLAG: --event-qps="50"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381861 4817 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381870 4817 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381879 4817 flags.go:64] FLAG: --eviction-hard=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381889 4817 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381898 4817 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381907 4817 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381917 4817 flags.go:64] FLAG: --eviction-soft=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381926 4817 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381935 4817 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381944 4817 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381952 4817 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381961 4817 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381970 4817 flags.go:64] FLAG: --fail-swap-on="true"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381979 4817 flags.go:64] FLAG: --feature-gates=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381989 4817 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.381999 4817 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382007 4817 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382017 4817 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382026 4817 flags.go:64] FLAG: --healthz-port="10248"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382034 4817 flags.go:64] FLAG: --help="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382043 4817 flags.go:64] FLAG: --hostname-override=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382052 4817 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382061 4817 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382070 4817 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382078 4817 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382087 4817 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382096 4817 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382104 4817 flags.go:64] FLAG: --image-service-endpoint=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382113 4817 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382147 4817 flags.go:64] FLAG: --kube-api-burst="100"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382157 4817 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382166 4817 flags.go:64] FLAG: --kube-api-qps="50"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382175 4817 flags.go:64] FLAG: --kube-reserved=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382184 4817 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382192 4817 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382201 4817 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382210 4817 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382219 4817 flags.go:64] FLAG: --lock-file=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382229 4817 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382238 4817 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382247 4817 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382273 4817 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382283 4817 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382292 4817 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382301 4817 flags.go:64] FLAG: --logging-format="text"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382310 4817 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382320 4817 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382328 4817 flags.go:64] FLAG: --manifest-url=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382337 4817 flags.go:64] FLAG: --manifest-url-header=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382349 4817 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382358 4817 flags.go:64] FLAG: --max-open-files="1000000"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382369 4817 flags.go:64] FLAG: --max-pods="110"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382378 4817 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382389 4817 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382398 4817 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382407 4817 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382416 4817 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382425 4817 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382434 4817 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382453 4817 flags.go:64] FLAG: --node-status-max-images="50"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382462 4817 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382471 4817 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382480 4817 flags.go:64] FLAG: --pod-cidr=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382489 4817 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382501 4817 flags.go:64] FLAG: --pod-manifest-path=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382509 4817 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382518 4817 flags.go:64] FLAG: --pods-per-core="0"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382527 4817 flags.go:64] FLAG: --port="10250"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382536 4817 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382545 4817 flags.go:64] FLAG: --provider-id=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382553 4817 flags.go:64] FLAG: --qos-reserved=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382562 4817 flags.go:64] FLAG: --read-only-port="10255"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382571 4817 flags.go:64] FLAG: --register-node="true"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382580 4817 flags.go:64] FLAG: --register-schedulable="true"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382588 4817 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382609 4817 flags.go:64] FLAG: --registry-burst="10"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382618 4817 flags.go:64] FLAG: --registry-qps="5"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382626 4817 flags.go:64] FLAG: --reserved-cpus=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382637 4817 flags.go:64] FLAG: --reserved-memory=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382648 4817 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382658 4817 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382667 4817 flags.go:64] FLAG: --rotate-certificates="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382676 4817 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382685 4817 flags.go:64] FLAG: --runonce="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382694 4817 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382704 4817 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382713 4817 flags.go:64] FLAG: --seccomp-default="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382721 4817 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382730 4817 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382739 4817 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382748 4817 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382757 4817 flags.go:64] FLAG: --storage-driver-password="root"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382766 4817 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382774 4817 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382783 4817 flags.go:64] FLAG: --storage-driver-user="root"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382792 4817 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382801 4817 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382810 4817 flags.go:64] FLAG: --system-cgroups=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382819 4817 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382833 4817 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382841 4817 flags.go:64] FLAG: --tls-cert-file=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382850 4817 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382860 4817 flags.go:64] FLAG: --tls-min-version=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382869 4817 flags.go:64] FLAG: --tls-private-key-file=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382877 4817 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382886 4817 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382895 4817 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382903 4817 flags.go:64] FLAG: --v="2"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382914 4817 flags.go:64] FLAG: --version="false"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382926 4817 flags.go:64] FLAG: --vmodule=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382936 4817 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.382948 4817 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383173 4817 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383185 4817 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383196 4817 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383205 4817 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383214 4817 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383222 4817 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383233 4817 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383243 4817 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383253 4817 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383262 4817 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383270 4817 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383280 4817 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383290 4817 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383299 4817 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383308 4817 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383316 4817 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383326 4817 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383333 4817 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383341 4817 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383349 4817 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383357 4817 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383365 4817 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383373 4817 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383380 4817 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383388 4817 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383395 4817 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383403 4817 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383410 4817 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383418 4817 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383425 4817 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383433 4817 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383441 4817 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383449 4817 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383456 4817 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383463 4817 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383471 4817 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383479 4817 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383488 4817 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383499 4817 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383508 4817 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383517 4817 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383524 4817 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383532 4817 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383539 4817 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383547 4817 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383554 4817 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383562 4817 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383570 4817 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383577 4817 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383585 4817 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383592 4817 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383600 4817 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383610 4817 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383619 4817 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383627 4817 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383634 4817 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383642 4817 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383650 4817 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383658 4817 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383666 4817 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383674 4817 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383681 4817 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383695 4817 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383703 4817 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383713 4817 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383723 4817 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383732 4817 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383740 4817 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383749 4817 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383756 4817 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.383764 4817 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.383788 4817 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.396240 4817 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.396297 4817 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396438 4817 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396453 4817 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396463 4817 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396474 4817 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396483 4817 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396492 4817 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396504 4817 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396516 4817 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396526 4817 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396536 4817 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396544 4817 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396553 4817 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396560 4817 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396568 4817 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396578 4817 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396588 4817 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396596 4817 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396604 4817 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396611 4817 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396619 4817 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396627 4817 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396636 4817 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396645 4817 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396653 4817 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396662 4817 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396670 4817 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396678 4817 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396686 4817 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396693 4817 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320
12:27:18.396703 4817 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396710 4817 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396719 4817 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396727 4817 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396735 4817 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396744 4817 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396752 4817 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396759 4817 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396768 4817 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396776 4817 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396784 4817 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396791 4817 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396799 4817 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396807 4817 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396815 4817 feature_gate.go:330] unrecognized feature 
gate: Example Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396823 4817 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396831 4817 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396838 4817 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396846 4817 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396855 4817 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396862 4817 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396871 4817 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396881 4817 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396888 4817 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396899 4817 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396909 4817 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396918 4817 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396927 4817 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396935 4817 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396945 4817 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396953 4817 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396961 4817 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396969 4817 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396978 4817 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396986 4817 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.396994 4817 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397002 4817 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397010 4817 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397018 4817 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397025 4817 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAWS Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397035 4817 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397045 4817 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.397059 4817 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397313 4817 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
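The `feature_gate.go:386` entries above dump the effective gates as a Go map literal, e.g. `feature gates: {map[CloudDualStackNodeIPs:true ... VolumeAttributesClass:false]}`. A minimal sketch of pulling that map out of a log line into a name-to-bool dict (the helper `parse_feature_gates` is hypothetical, not part of kubelet or any library):

```python
import re

def parse_feature_gates(line: str) -> dict[str, bool]:
    """Parse the Go-style map dump kubelet logs at feature_gate.go:386
    into a dict of gate name -> enabled."""
    m = re.search(r"\{map\[(.*?)\]\}", line)
    if not m:
        return {}
    gates = {}
    for pair in m.group(1).split():          # pairs look like "KMSv1:true"
        name, _, value = pair.partition(":")
        gates[name] = value == "true"
    return gates

# Shortened example taken from the log entry above.
line = ("I0320 12:27:18.397059 4817 feature_gate.go:386] feature gates: "
        "{map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false]}")
gates = parse_feature_gates(line)
print(gates)
```

This only handles the simple `name:bool` pairs seen in these entries; nested map values would need a real parser.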
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397327 4817 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397337 4817 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397346 4817 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397355 4817 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397362 4817 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397371 4817 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397379 4817 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397387 4817 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397394 4817 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397402 4817 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397410 4817 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397419 4817 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397427 4817 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397434 4817 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397442 
4817 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397450 4817 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397458 4817 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397466 4817 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397474 4817 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397482 4817 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397490 4817 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397499 4817 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397506 4817 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397514 4817 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397521 4817 feature_gate.go:330] unrecognized feature gate: Example Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397529 4817 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397541 4817 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397573 4817 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397582 4817 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397590 4817 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397599 4817 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397607 4817 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397615 4817 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397623 4817 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397631 4817 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397638 4817 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397648 4817 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397656 4817 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397664 4817 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397671 4817 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397679 4817 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 
12:27:18.397689 4817 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397697 4817 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397708 4817 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397718 4817 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397727 4817 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397736 4817 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397745 4817 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397754 4817 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397762 4817 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397771 4817 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397779 4817 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397816 4817 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397826 4817 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397834 4817 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397842 4817 
feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397850 4817 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397857 4817 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397866 4817 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397873 4817 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397881 4817 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397888 4817 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397897 4817 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397905 4817 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397915 4817 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397923 4817 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397931 4817 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397938 4817 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397949 4817 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.397959 4817 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.397971 4817 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.398252 4817 server.go:940] "Client rotation is on, will bootstrap in background" Mar 20 12:27:18 crc kubenswrapper[4817]: E0320 12:27:18.406472 4817 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.410901 4817 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.411048 4817 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
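The `bootstrap.go:266` error above reports the client certificate's expiry as `2026-02-24 05:52:08 +0000 UTC`, after which kubelet falls back to the bootstrap credentials and requests a fresh cert. A sketch of the expiry comparison on that timestamp format (the `is_expired` helper is illustrative, not kubelet code; the trailing `UTC` zone name has to be stripped before `strptime` can parse the `+0000` offset):

```python
from datetime import datetime, timezone

def is_expired(not_after: str, now: datetime) -> bool:
    """True if a 'YYYY-MM-DD HH:MM:SS +0000 UTC' timestamp is in the past."""
    ts = datetime.strptime(not_after.removesuffix(" UTC"),
                           "%Y-%m-%d %H:%M:%S %z")
    return ts <= now

# Journal timestamp of the entry above, assuming the log year is 2026.
now = datetime(2026, 3, 20, 12, 27, 18, tzinfo=timezone.utc)
print(is_expired("2026-02-24 05:52:08 +0000 UTC", now))
```

The same check works for the `notAfter` of any cert under `/var/lib/kubelet/pki/` once its date is extracted.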
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.414256 4817 server.go:997] "Starting client certificate rotation" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.414316 4817 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.414582 4817 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.444573 4817 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 12:27:18 crc kubenswrapper[4817]: E0320 12:27:18.447678 4817 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.447898 4817 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.467199 4817 log.go:25] "Validated CRI v1 runtime API" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.508337 4817 log.go:25] "Validated CRI v1 image API" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.514164 4817 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.521335 4817 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-12-22-26-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.521380 4817 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.551227 4817 manager.go:217] Machine: {Timestamp:2026-03-20 12:27:18.546639518 +0000 UTC m=+0.634952341 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:8b6acc6f-72ef-432d-a377-611bb5e5be3b BootID:1ed2f349-5c70-4bd5-a6f5-330128fd6277 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:0b:66:b5 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:0b:66:b5 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:52:6c:e6 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:f2:6f:33 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:2a:80:37 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ab:b9:f2 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:4e:46:88:e8:7c:e0 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ce:3d:4c:80:6e:59 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.551601 4817 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
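The cAdvisor `Machine` record above reports each filesystem's capacity in raw bytes (e.g. `/dev/vda4` at `85292941312`). A small sketch, not kubelet code, converting a few of those logged values to GiB for readability:

```python
# Device -> capacity in bytes, copied from the Machine record above.
filesystems = {
    "/dev/vda4": 85292941312,    # mounted at /var
    "/dev/vda3": 366869504,      # mounted at /boot
    "/var/lib/etcd": 1073741824,
}

def to_gib(n_bytes: int) -> float:
    """Convert a byte count to GiB, rounded to two decimals."""
    return round(n_bytes / 2**30, 2)

for dev, cap in filesystems.items():
    print(f"{dev}: {to_gib(cap)} GiB")
```

Note cAdvisor reports binary capacities, so `/var/lib/etcd` at 1073741824 bytes is exactly 1 GiB.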
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.551809 4817 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.554736 4817 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.555066 4817 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.555165 4817 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.555505 4817 topology_manager.go:138] "Creating topology manager with none policy" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.555527 4817 container_manager_linux.go:303] "Creating device plugin manager" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.556604 4817 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.556649 4817 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.557010 4817 state_mem.go:36] "Initialized new in-memory state store" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.557212 4817 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.561852 4817 kubelet.go:418] "Attempting to sync node with API server" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.561890 4817 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.561928 4817 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.561949 4817 kubelet.go:324] "Adding apiserver pod source" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.561966 4817 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 
12:27:18.568703 4817 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.570400 4817 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.572213 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.572236 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Mar 20 12:27:18 crc kubenswrapper[4817]: E0320 12:27:18.572334 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Mar 20 12:27:18 crc kubenswrapper[4817]: E0320 12:27:18.572352 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.573114 4817 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 20 12:27:18 crc 
kubenswrapper[4817]: I0320 12:27:18.575289 4817 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.575327 4817 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.575342 4817 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.575355 4817 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.575376 4817 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.575390 4817 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.575402 4817 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.575423 4817 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.575439 4817 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.575453 4817 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.575472 4817 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.575485 4817 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.579013 4817 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.579654 4817 server.go:1280] "Started kubelet" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 
12:27:18.580417 4817 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.580684 4817 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.581634 4817 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 12:27:18 crc systemd[1]: Started Kubernetes Kubelet. Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.587759 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.589486 4817 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.589556 4817 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.590095 4817 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.590171 4817 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.590440 4817 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 20 12:27:18 crc kubenswrapper[4817]: E0320 12:27:18.591548 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="200ms" Mar 20 12:27:18 crc kubenswrapper[4817]: E0320 12:27:18.590183 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.591498 4817 server.go:460] "Adding debug handlers to kubelet server" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.592723 4817 factory.go:55] Registering systemd factory Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.593047 4817 factory.go:221] Registration of the systemd container factory successfully Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.592942 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Mar 20 12:27:18 crc kubenswrapper[4817]: E0320 12:27:18.593164 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Mar 20 12:27:18 crc kubenswrapper[4817]: E0320 12:27:18.592341 4817 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e8c5dffa81a55 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.579608149 +0000 UTC m=+0.667920962,LastTimestamp:2026-03-20 12:27:18.579608149 +0000 UTC m=+0.667920962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:18 crc kubenswrapper[4817]: 
I0320 12:27:18.593903 4817 factory.go:153] Registering CRI-O factory Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.593978 4817 factory.go:221] Registration of the crio container factory successfully Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.594096 4817 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.594182 4817 factory.go:103] Registering Raw factory Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.594218 4817 manager.go:1196] Started watching for new ooms in manager Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.598181 4817 manager.go:319] Starting recovery of all containers Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.618685 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619063 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619096 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619159 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619185 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619204 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619224 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619243 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619274 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619295 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619313 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619332 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619351 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619373 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619391 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619410 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" 
seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619431 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619451 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619469 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619489 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619507 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619525 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 
12:27:18.619544 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619563 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619583 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619601 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619623 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619643 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619662 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619681 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619698 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619718 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619771 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619792 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619813 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619831 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619850 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619869 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619890 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619952 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.619974 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.620002 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622407 4817 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622455 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622477 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622501 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622523 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622582 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622607 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622627 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622646 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622665 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622684 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622709 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622731 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622753 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622773 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622815 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622836 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622855 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622873 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622893 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622911 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622930 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622949 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" 
seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622968 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.622987 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623004 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623023 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623042 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623062 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 20 12:27:18 crc 
kubenswrapper[4817]: I0320 12:27:18.623102 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623154 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623183 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623205 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623223 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623243 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623261 4817 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623281 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623300 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623319 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623520 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623548 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623573 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623603 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623627 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623646 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623667 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623686 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623704 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623725 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623744 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623768 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623795 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623821 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623844 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623871 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623895 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623918 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623942 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623968 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.623993 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624022 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624048 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624073 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624116 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624190 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624219 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624250 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624280 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624306 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624332 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624361 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624389 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" 
seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624422 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624447 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624473 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624497 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624521 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624546 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624572 4817 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624598 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624621 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624646 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624672 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624698 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624725 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624750 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624779 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624809 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624839 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624865 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624889 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624913 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624938 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624965 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.624989 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625014 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625041 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625066 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625090 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625116 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625183 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625224 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625252 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" 
seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625277 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625305 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625329 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625531 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625556 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625583 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625608 4817 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625633 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625660 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625686 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625711 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625737 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625763 4817 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625788 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625816 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625844 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625869 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625895 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625921 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625947 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625971 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.625998 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626022 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626047 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626075 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626100 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626159 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626189 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626216 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626242 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626266 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626293 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626318 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626343 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626369 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626395 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626428 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626455 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626479 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626507 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626532 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626558 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626636 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626665 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626692 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626715 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626741 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626768 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626793 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626816 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626842 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626868 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626896 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626920 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626945 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626971 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.626995 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.627019 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.627043 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.627068 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.627095 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.627153 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.627183 4817 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.627209 4817 reconstruct.go:97] "Volume reconstruction finished"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.627226 4817 reconciler.go:26] "Reconciler: start to sync state"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.635026 4817 manager.go:324] Recovery completed
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.651404 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.653271 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.653303 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.653315 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.654411 4817 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.654448 4817 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.654480 4817 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.659738 4817 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.662106 4817 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.662157 4817 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.662182 4817 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 20 12:27:18 crc kubenswrapper[4817]: E0320 12:27:18.662226 4817 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 20 12:27:18 crc kubenswrapper[4817]: W0320 12:27:18.662898 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Mar 20 12:27:18 crc kubenswrapper[4817]: E0320 12:27:18.663097 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.667382 4817 policy_none.go:49] "None policy: Start"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.669529 4817 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.669555 4817 state_mem.go:35] "Initializing new in-memory state store"
Mar 20 12:27:18 crc kubenswrapper[4817]: E0320 12:27:18.691982 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.726446 4817 manager.go:334] "Starting Device Plugin manager"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.726733 4817 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.726759 4817 server.go:79] "Starting device plugin registration server"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.727304 4817 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.727325 4817 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.727911 4817 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.728010 4817 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.728021 4817 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 20 12:27:18 crc kubenswrapper[4817]: E0320 12:27:18.737744 4817 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.763138 4817 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.763314 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.764641 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.764692 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.764710 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.764890 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.765174 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.765216 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.765912 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.765935 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.765946 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.766021 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.766055 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.766069 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.766219 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.766367 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.766401 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.767185 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.767252 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.767269 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.767349 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.767374 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.767374 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.767469 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.767431 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.767661 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.768025 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.768057 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.768068 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.768210 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.768409 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.768455 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.768837 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.768865 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.768878 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.768990 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.769011 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.769024 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.769152 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.769169 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.769179 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.769244 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.769280 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.771197 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.771256 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.771271 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:18 crc kubenswrapper[4817]: E0320 12:27:18.792378 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="400ms"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.829318 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.829377 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.829431 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.829471 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.829508 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.829541 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.829574 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.829647 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.829716 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.829765 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.829809 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.829849 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.829880 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.829942 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.829981 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.830012 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.830965 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.831035 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.831055 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.831095 4817 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: E0320 12:27:18.831907 4817 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.931716 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.931790 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.931825 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.931855 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.931883 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.931918 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.931981 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932015 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932045 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932071 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932102 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932161 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932190 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932175 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932219 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932218 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932273 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932185 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932300 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932324 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932367 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932285 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 12:27:18 crc
kubenswrapper[4817]: I0320 12:27:18.932251 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932331 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932231 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932242 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932346 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932346 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932274 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:27:18 crc kubenswrapper[4817]: I0320 12:27:18.932640 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.032606 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.034419 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.034472 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.034485 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.034517 4817 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 12:27:19 crc kubenswrapper[4817]: E0320 12:27:19.035050 4817 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc" 
Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.109231 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.137321 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.158353 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 12:27:19 crc kubenswrapper[4817]: W0320 12:27:19.159423 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-9e8fc3300ff711b27ff61fbf8268055dcecb8576ddd06fef6a76f5094b8c3269 WatchSource:0}: Error finding container 9e8fc3300ff711b27ff61fbf8268055dcecb8576ddd06fef6a76f5094b8c3269: Status 404 returned error can't find the container with id 9e8fc3300ff711b27ff61fbf8268055dcecb8576ddd06fef6a76f5094b8c3269
Mar 20 12:27:19 crc kubenswrapper[4817]: W0320 12:27:19.180538 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-b5d9c6d919bd6232633e414fdb9f8f8408034f4cbe2a424d9c0927fc8b25aa23 WatchSource:0}: Error finding container b5d9c6d919bd6232633e414fdb9f8f8408034f4cbe2a424d9c0927fc8b25aa23: Status 404 returned error can't find the container with id b5d9c6d919bd6232633e414fdb9f8f8408034f4cbe2a424d9c0927fc8b25aa23
Mar 20 12:27:19 crc kubenswrapper[4817]: W0320 12:27:19.182617 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-f180540322689ea9c55460335d122220724513b7005c672a0d217124f2077e53 WatchSource:0}: Error finding container f180540322689ea9c55460335d122220724513b7005c672a0d217124f2077e53: Status 404 returned error can't find the container with id f180540322689ea9c55460335d122220724513b7005c672a0d217124f2077e53
Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.184648 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 20 12:27:19 crc kubenswrapper[4817]: E0320 12:27:19.193245 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="800ms"
Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.197263 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 12:27:19 crc kubenswrapper[4817]: W0320 12:27:19.207413 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-4e62c390f9f97c93004df1cf3672b26b65c2826a6f469021e8392854739254be WatchSource:0}: Error finding container 4e62c390f9f97c93004df1cf3672b26b65c2826a6f469021e8392854739254be: Status 404 returned error can't find the container with id 4e62c390f9f97c93004df1cf3672b26b65c2826a6f469021e8392854739254be
Mar 20 12:27:19 crc kubenswrapper[4817]: W0320 12:27:19.227219 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-26e96a06130cae32c3437bbe5d39afe391c0a72be855f23bac72e9cfec5bc973 WatchSource:0}: Error finding container 26e96a06130cae32c3437bbe5d39afe391c0a72be855f23bac72e9cfec5bc973: Status 404 returned error can't find the container with id 26e96a06130cae32c3437bbe5d39afe391c0a72be855f23bac72e9cfec5bc973
Mar 20 12:27:19 crc kubenswrapper[4817]: W0320 12:27:19.410419 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Mar 20 12:27:19 crc kubenswrapper[4817]: E0320 12:27:19.410556 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError"
Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.435202 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.436472 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.436539 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.436565 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.436609 4817 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 12:27:19 crc kubenswrapper[4817]: E0320 12:27:19.437320 4817 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc"
Mar 20 12:27:19 crc kubenswrapper[4817]: W0320 12:27:19.576018 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Mar 20 12:27:19 crc kubenswrapper[4817]: E0320 12:27:19.576112 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError"
Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.588858 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.665815 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"26e96a06130cae32c3437bbe5d39afe391c0a72be855f23bac72e9cfec5bc973"}
Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.667290 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4e62c390f9f97c93004df1cf3672b26b65c2826a6f469021e8392854739254be"}
Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.668423 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f180540322689ea9c55460335d122220724513b7005c672a0d217124f2077e53"}
Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.669319 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b5d9c6d919bd6232633e414fdb9f8f8408034f4cbe2a424d9c0927fc8b25aa23"}
Mar 20 12:27:19 crc kubenswrapper[4817]: I0320 12:27:19.670701 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9e8fc3300ff711b27ff61fbf8268055dcecb8576ddd06fef6a76f5094b8c3269"}
Mar 20 12:27:19 crc kubenswrapper[4817]: W0320 12:27:19.818588 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Mar 20 12:27:19 crc kubenswrapper[4817]: E0320 12:27:19.818668 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError"
Mar 20 12:27:19 crc kubenswrapper[4817]: W0320 12:27:19.911548 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Mar 20 12:27:19 crc kubenswrapper[4817]: E0320 12:27:19.911662 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError"
Mar 20 12:27:19 crc kubenswrapper[4817]: E0320 12:27:19.994970 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="1.6s"
Mar 20 12:27:20 crc kubenswrapper[4817]: E0320 12:27:20.014438 4817 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e8c5dffa81a55 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.579608149 +0000 UTC m=+0.667920962,LastTimestamp:2026-03-20 12:27:18.579608149 +0000 UTC m=+0.667920962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.237776 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.239066 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.239104 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.239138 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.239170 4817 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 12:27:20 crc kubenswrapper[4817]: E0320 12:27:20.239656 4817 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.530319 4817 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 12:27:20 crc kubenswrapper[4817]: E0320 12:27:20.531405 4817 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.589407 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.674923 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9038c92a5069bb2d6a27e3816489724a8a479f0f8e98d8dd474f132ec2e9f759" exitCode=0
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.675003 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9038c92a5069bb2d6a27e3816489724a8a479f0f8e98d8dd474f132ec2e9f759"}
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.675106 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.676630 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.676664 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.676675 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.678336 4817 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9bb3bac383955bc69382293b5d366b03f2bf16fbb69040edc0c5ee7c2ed50260" exitCode=0
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.678458 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.678492 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9bb3bac383955bc69382293b5d366b03f2bf16fbb69040edc0c5ee7c2ed50260"}
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.678560 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.679314 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.679365 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.679385 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.679926 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.679957 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.679971 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.681242 4817 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="8edd813cc8d0a60dc67a710c940e884854a02b175d8a19ad4226e8af5900f9f4" exitCode=0
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.681385 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.681395 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"8edd813cc8d0a60dc67a710c940e884854a02b175d8a19ad4226e8af5900f9f4"}
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.683068 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.683156 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.683176 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.684177 4817 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="acfa688c1d9979f5b480ce8598d477bde5fd4217d5e43e359d0a2f1dbfc04bb0" exitCode=0
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.684255 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"acfa688c1d9979f5b480ce8598d477bde5fd4217d5e43e359d0a2f1dbfc04bb0"}
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.684325 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.685703 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.685757 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.685778 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.688121 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c928bf0f0a39f57d706f264994b4f66d1fb3bfcdc1c458e30a1ca77cbabd79d5"}
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.688179 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ff03f238e6340189ee78d62a60f90440705bdbac42e3e7a1e1ec58913c84b970"}
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.688196 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d8714a8ff60ef9382ef5a63393a34d3aeeb6ac73694c5ad1729d5362a92a5bcd"}
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.688211 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7d9b567fdd75a25d9edddbbf2a0d25f719ee7e2adc49dd4d46449adbb3266233"}
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.688283 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.689438 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.689480 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:20 crc kubenswrapper[4817]: I0320 12:27:20.689495 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.225633 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.234844 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 12:27:21 crc kubenswrapper[4817]: W0320 12:27:21.443423 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Mar 20 12:27:21 crc kubenswrapper[4817]: E0320 12:27:21.443512 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.589280 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Mar 20 12:27:21 crc kubenswrapper[4817]: E0320 12:27:21.596250 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="3.2s"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.694231 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8c0332d0f40a5f98e36bfe09d58a8f6982229f6199e3ae5f4f2fa1fb6e8cace3"}
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.694289 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"09c8216d91ba3dba33a3aff1c79ce42139611b05a5f993d10729fde00745841b"}
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.694305 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"36002b64644d46d0a1aba799f41e35a249b2e5d763ccf6d253730f4f11860eae"}
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.694325 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"25010709ae6e010f514efd55c5e0806531d54505cb61b091583b6b1574087ff7"}
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.696515 4817 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c4c957dbd3b80ac7d3b889dd3986c88b60c632d05c604322f27143caf07cac1a" exitCode=0
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.696539 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c4c957dbd3b80ac7d3b889dd3986c88b60c632d05c604322f27143caf07cac1a"}
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.696617 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.697949 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.697982 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.697994 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.699060 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c16dbf1b8bae8058054e7362e09f292c57e61ebd2f1972d2e06c73a2cd0a89f4"}
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.699202 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.700158 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.700184 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.700196 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.703318 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.703372 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.703452 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9e0fb43d15d7b24b67e613b16778615a3ead658ed6800eddcec672dd4b601d2c"}
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.703475 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8eaf8b7cd990ca656c555c3c0afe7471cc02ddf09cc24ebb2736d5fc834d96ef"}
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.703489 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4cda0f3b764fd3641bf9c2c992bc181350b9454b250c601b56a0edf7f5f5a0e5"}
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.704308 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.704329 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.704337 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.704674 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.704716 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.704729 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:21 crc kubenswrapper[4817]: W0320 12:27:21.718746 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Mar 20 12:27:21 crc kubenswrapper[4817]: E0320 12:27:21.718827 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.839750 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.843108 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.843200 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:21 crc kubenswrapper[4817]: I0320 12:27:21.843226 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:21 crc 
kubenswrapper[4817]: I0320 12:27:21.843324 4817 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 12:27:21 crc kubenswrapper[4817]: E0320 12:27:21.844256 4817 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.709679 4817 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="eaa3fd9399391e988d6bf1cf4cccf4fba9d82314dee7ed7b35d958e1784d8705" exitCode=0 Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.709770 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.709771 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"eaa3fd9399391e988d6bf1cf4cccf4fba9d82314dee7ed7b35d958e1784d8705"} Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.710759 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.710811 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.710821 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.714964 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.714956 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0bcd0dee789500ceb7cccae394cf76a166a5bdb073ef4f86cb269a1409f4953c"} Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.715185 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.715218 4817 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.715249 4817 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.715276 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.715306 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.715687 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.715728 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.715740 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.717164 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.717185 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.717194 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:22 crc kubenswrapper[4817]: 
I0320 12:27:22.717257 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.717289 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.717298 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.717334 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.717354 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:22 crc kubenswrapper[4817]: I0320 12:27:22.717306 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:23 crc kubenswrapper[4817]: I0320 12:27:23.686583 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:27:23 crc kubenswrapper[4817]: I0320 12:27:23.722253 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4122a9d5e425a212c4f5dfe6de356e5fbd102665882ecf9c325eb2daa5fd0266"} Mar 20 12:27:23 crc kubenswrapper[4817]: I0320 12:27:23.722360 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"967cf2d9a1ad83a016ab10dfee978b18b7e1177a67bb99adef38d5851c9cd8ed"} Mar 20 12:27:23 crc kubenswrapper[4817]: I0320 12:27:23.722388 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d8a1a879aa026f119248b48e18968ebf4ac0fcee18285b1939a8fac7b493fd17"} Mar 20 12:27:23 crc kubenswrapper[4817]: I0320 12:27:23.722368 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:23 crc kubenswrapper[4817]: I0320 12:27:23.724037 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:23 crc kubenswrapper[4817]: I0320 12:27:23.724109 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:23 crc kubenswrapper[4817]: I0320 12:27:23.724198 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:24 crc kubenswrapper[4817]: I0320 12:27:24.249402 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:27:24 crc kubenswrapper[4817]: I0320 12:27:24.418358 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 12:27:24 crc kubenswrapper[4817]: I0320 12:27:24.418579 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:24 crc kubenswrapper[4817]: I0320 12:27:24.420030 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:24 crc kubenswrapper[4817]: I0320 12:27:24.420079 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:24 crc kubenswrapper[4817]: I0320 12:27:24.420096 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:24 crc kubenswrapper[4817]: I0320 12:27:24.701904 4817 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 12:27:24 crc kubenswrapper[4817]: I0320 12:27:24.730101 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9749c901dcbe3bee344a58edf16e3c9bd9e492ecee53680f46c52823e2b43324"} Mar 20 12:27:24 crc kubenswrapper[4817]: I0320 12:27:24.730203 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"02f92bcbea9e24f2f86a419b918840189ccef42b24f780089927cf6b85ca096d"} Mar 20 12:27:24 crc kubenswrapper[4817]: I0320 12:27:24.730210 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:24 crc kubenswrapper[4817]: I0320 12:27:24.730164 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:24 crc kubenswrapper[4817]: I0320 12:27:24.734822 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:24 crc kubenswrapper[4817]: I0320 12:27:24.734889 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:24 crc kubenswrapper[4817]: I0320 12:27:24.734939 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:24 crc kubenswrapper[4817]: I0320 12:27:24.737280 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:24 crc kubenswrapper[4817]: I0320 12:27:24.737335 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:24 crc kubenswrapper[4817]: I0320 12:27:24.737355 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 12:27:25 crc kubenswrapper[4817]: I0320 12:27:25.044918 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:25 crc kubenswrapper[4817]: I0320 12:27:25.046662 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:25 crc kubenswrapper[4817]: I0320 12:27:25.046727 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:25 crc kubenswrapper[4817]: I0320 12:27:25.046747 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:25 crc kubenswrapper[4817]: I0320 12:27:25.046790 4817 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 12:27:25 crc kubenswrapper[4817]: I0320 12:27:25.733222 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:25 crc kubenswrapper[4817]: I0320 12:27:25.733279 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:25 crc kubenswrapper[4817]: I0320 12:27:25.734916 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:25 crc kubenswrapper[4817]: I0320 12:27:25.734980 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:25 crc kubenswrapper[4817]: I0320 12:27:25.735033 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:25 crc kubenswrapper[4817]: I0320 12:27:25.735638 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:25 crc kubenswrapper[4817]: I0320 12:27:25.735701 4817 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:25 crc kubenswrapper[4817]: I0320 12:27:25.735723 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:26 crc kubenswrapper[4817]: I0320 12:27:26.052490 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 12:27:26 crc kubenswrapper[4817]: I0320 12:27:26.052671 4817 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 12:27:26 crc kubenswrapper[4817]: I0320 12:27:26.052724 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:26 crc kubenswrapper[4817]: I0320 12:27:26.054317 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:26 crc kubenswrapper[4817]: I0320 12:27:26.054393 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:26 crc kubenswrapper[4817]: I0320 12:27:26.054410 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:26 crc kubenswrapper[4817]: I0320 12:27:26.122881 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 20 12:27:26 crc kubenswrapper[4817]: I0320 12:27:26.220667 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:27:26 crc kubenswrapper[4817]: I0320 12:27:26.736204 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:26 crc kubenswrapper[4817]: I0320 12:27:26.736254 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:26 crc kubenswrapper[4817]: I0320 12:27:26.737772 4817 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:26 crc kubenswrapper[4817]: I0320 12:27:26.737828 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:26 crc kubenswrapper[4817]: I0320 12:27:26.737847 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:26 crc kubenswrapper[4817]: I0320 12:27:26.737939 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:26 crc kubenswrapper[4817]: I0320 12:27:26.737956 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:26 crc kubenswrapper[4817]: I0320 12:27:26.737964 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:27 crc kubenswrapper[4817]: I0320 12:27:27.559403 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 12:27:27 crc kubenswrapper[4817]: I0320 12:27:27.559602 4817 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 12:27:27 crc kubenswrapper[4817]: I0320 12:27:27.559665 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:27 crc kubenswrapper[4817]: I0320 12:27:27.561985 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:27 crc kubenswrapper[4817]: I0320 12:27:27.562063 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:27 crc kubenswrapper[4817]: I0320 12:27:27.562084 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
12:27:28 crc kubenswrapper[4817]: E0320 12:27:28.738499 4817 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 12:27:28 crc kubenswrapper[4817]: I0320 12:27:28.983197 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 12:27:28 crc kubenswrapper[4817]: I0320 12:27:28.983393 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:28 crc kubenswrapper[4817]: I0320 12:27:28.984732 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:28 crc kubenswrapper[4817]: I0320 12:27:28.984766 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:28 crc kubenswrapper[4817]: I0320 12:27:28.984778 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:29 crc kubenswrapper[4817]: I0320 12:27:29.053486 4817 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 12:27:29 crc kubenswrapper[4817]: I0320 12:27:29.053585 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 12:27:29 crc kubenswrapper[4817]: I0320 12:27:29.497515 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-etcd/etcd-crc" Mar 20 12:27:29 crc kubenswrapper[4817]: I0320 12:27:29.497723 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:29 crc kubenswrapper[4817]: I0320 12:27:29.499183 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:29 crc kubenswrapper[4817]: I0320 12:27:29.499230 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:29 crc kubenswrapper[4817]: I0320 12:27:29.499246 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:32 crc kubenswrapper[4817]: W0320 12:27:32.300275 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 12:27:32 crc kubenswrapper[4817]: I0320 12:27:32.300402 4817 trace.go:236] Trace[793496527]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 12:27:22.298) (total time: 10001ms): Mar 20 12:27:32 crc kubenswrapper[4817]: Trace[793496527]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:27:32.300) Mar 20 12:27:32 crc kubenswrapper[4817]: Trace[793496527]: [10.001904986s] [10.001904986s] END Mar 20 12:27:32 crc kubenswrapper[4817]: E0320 12:27:32.300431 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 12:27:32 crc kubenswrapper[4817]: I0320 
12:27:32.591194 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 20 12:27:32 crc kubenswrapper[4817]: W0320 12:27:32.728686 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 12:27:32 crc kubenswrapper[4817]: I0320 12:27:32.728833 4817 trace.go:236] Trace[287096859]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 12:27:22.726) (total time: 10001ms): Mar 20 12:27:32 crc kubenswrapper[4817]: Trace[287096859]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:27:32.728) Mar 20 12:27:32 crc kubenswrapper[4817]: Trace[287096859]: [10.0018732s] [10.0018732s] END Mar 20 12:27:32 crc kubenswrapper[4817]: E0320 12:27:32.728896 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 12:27:32 crc kubenswrapper[4817]: I0320 12:27:32.756756 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 12:27:32 crc kubenswrapper[4817]: I0320 12:27:32.759139 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0bcd0dee789500ceb7cccae394cf76a166a5bdb073ef4f86cb269a1409f4953c" exitCode=255 Mar 20 12:27:32 crc 
kubenswrapper[4817]: I0320 12:27:32.759166 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0bcd0dee789500ceb7cccae394cf76a166a5bdb073ef4f86cb269a1409f4953c"} Mar 20 12:27:32 crc kubenswrapper[4817]: I0320 12:27:32.759382 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:32 crc kubenswrapper[4817]: I0320 12:27:32.760723 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:32 crc kubenswrapper[4817]: I0320 12:27:32.760762 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:32 crc kubenswrapper[4817]: I0320 12:27:32.760772 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:32 crc kubenswrapper[4817]: I0320 12:27:32.761485 4817 scope.go:117] "RemoveContainer" containerID="0bcd0dee789500ceb7cccae394cf76a166a5bdb073ef4f86cb269a1409f4953c" Mar 20 12:27:33 crc kubenswrapper[4817]: W0320 12:27:33.233398 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:33Z is after 2026-02-23T05:33:13Z Mar 20 12:27:33 crc kubenswrapper[4817]: E0320 12:27:33.233524 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-20T12:27:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 12:27:33 crc kubenswrapper[4817]: W0320 12:27:33.235041 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:33Z is after 2026-02-23T05:33:13Z Mar 20 12:27:33 crc kubenswrapper[4817]: E0320 12:27:33.235190 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 12:27:33 crc kubenswrapper[4817]: E0320 12:27:33.243255 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:33Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 20 12:27:33 crc kubenswrapper[4817]: E0320 12:27:33.244503 4817 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 12:27:33 
crc kubenswrapper[4817]: I0320 12:27:33.246045 4817 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 12:27:33 crc kubenswrapper[4817]: I0320 12:27:33.246142 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 12:27:33 crc kubenswrapper[4817]: E0320 12:27:33.246208 4817 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:33Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8c5dffa81a55 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.579608149 +0000 UTC m=+0.667920962,LastTimestamp:2026-03-20 12:27:18.579608149 +0000 UTC m=+0.667920962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:33 crc kubenswrapper[4817]: E0320 12:27:33.246633 4817 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:33Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 12:27:33 crc kubenswrapper[4817]: I0320 12:27:33.250354 4817 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 12:27:33 crc kubenswrapper[4817]: I0320 12:27:33.250420 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 12:27:33 crc kubenswrapper[4817]: I0320 12:27:33.593072 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:33Z is after 2026-02-23T05:33:13Z
Mar 20 12:27:33 crc kubenswrapper[4817]: I0320 12:27:33.765789 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 12:27:33 crc kubenswrapper[4817]: I0320 12:27:33.768282 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0c602cdc9e81e3dd9081444fa0686992ae42beeed0d12d7687a74702e9c7356d"}
Mar 20 12:27:33 crc kubenswrapper[4817]: I0320 12:27:33.768621 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:33 crc kubenswrapper[4817]: I0320 12:27:33.770381 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:33 crc kubenswrapper[4817]: I0320 12:27:33.770421 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:33 crc kubenswrapper[4817]: I0320 12:27:33.770431 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:34 crc kubenswrapper[4817]: I0320 12:27:34.259577 4817 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]log ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]etcd ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/openshift.io-api-request-count-filter ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/openshift.io-startkubeinformers ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/priority-and-fairness-config-consumer ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/priority-and-fairness-filter ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/start-apiextensions-informers ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/start-apiextensions-controllers ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/crd-informer-synced ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/start-system-namespaces-controller ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/start-cluster-authentication-info-controller ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/start-legacy-token-tracking-controller ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/start-service-ip-repair-controllers ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/priority-and-fairness-config-producer ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/bootstrap-controller ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/start-kube-aggregator-informers ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/apiservice-status-local-available-controller ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/apiservice-status-remote-available-controller ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/apiservice-registration-controller ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/apiservice-wait-for-first-sync ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/apiservice-discovery-controller ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/kube-apiserver-autoregistration ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]autoregister-completion ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/apiservice-openapi-controller ok
Mar 20 12:27:34 crc kubenswrapper[4817]: [+]poststarthook/apiservice-openapiv3-controller ok
Mar 20 12:27:34 crc kubenswrapper[4817]: livez check failed
Mar 20 12:27:34 crc kubenswrapper[4817]: I0320 12:27:34.259640 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 12:27:34 crc kubenswrapper[4817]: I0320 12:27:34.593867 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:34Z is after 2026-02-23T05:33:13Z
Mar 20 12:27:34 crc kubenswrapper[4817]: I0320 12:27:34.774258 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 12:27:34 crc kubenswrapper[4817]: I0320 12:27:34.774871 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 12:27:34 crc kubenswrapper[4817]: I0320 12:27:34.777617 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0c602cdc9e81e3dd9081444fa0686992ae42beeed0d12d7687a74702e9c7356d" exitCode=255
Mar 20 12:27:34 crc kubenswrapper[4817]: I0320 12:27:34.777701 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0c602cdc9e81e3dd9081444fa0686992ae42beeed0d12d7687a74702e9c7356d"}
Mar 20 12:27:34 crc kubenswrapper[4817]: I0320 12:27:34.777825 4817 scope.go:117] "RemoveContainer" containerID="0bcd0dee789500ceb7cccae394cf76a166a5bdb073ef4f86cb269a1409f4953c"
Mar 20 12:27:34 crc kubenswrapper[4817]: I0320 12:27:34.777976 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:34 crc kubenswrapper[4817]: I0320 12:27:34.779452 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:34 crc kubenswrapper[4817]: I0320 12:27:34.779516 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:34 crc kubenswrapper[4817]: I0320 12:27:34.779534 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:34 crc kubenswrapper[4817]: I0320 12:27:34.780567 4817 scope.go:117] "RemoveContainer" containerID="0c602cdc9e81e3dd9081444fa0686992ae42beeed0d12d7687a74702e9c7356d"
Mar 20 12:27:34 crc kubenswrapper[4817]: E0320 12:27:34.780906 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 12:27:35 crc kubenswrapper[4817]: I0320 12:27:35.592643 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:35Z is after 2026-02-23T05:33:13Z
Mar 20 12:27:35 crc kubenswrapper[4817]: I0320 12:27:35.783542 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 12:27:36 crc kubenswrapper[4817]: I0320 12:27:36.170275 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 20 12:27:36 crc kubenswrapper[4817]: I0320 12:27:36.170461 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:36 crc kubenswrapper[4817]: I0320 12:27:36.171755 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:36 crc kubenswrapper[4817]: I0320 12:27:36.171791 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:36 crc kubenswrapper[4817]: I0320 12:27:36.171801 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:36 crc kubenswrapper[4817]: I0320 12:27:36.183507 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 20 12:27:36 crc kubenswrapper[4817]: I0320 12:27:36.593551 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:36Z is after 2026-02-23T05:33:13Z
Mar 20 12:27:36 crc kubenswrapper[4817]: I0320 12:27:36.789703 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:36 crc kubenswrapper[4817]: I0320 12:27:36.790960 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:36 crc kubenswrapper[4817]: I0320 12:27:36.791031 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:36 crc kubenswrapper[4817]: I0320 12:27:36.791053 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:37 crc kubenswrapper[4817]: W0320 12:27:37.454710 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:37Z is after 2026-02-23T05:33:13Z
Mar 20 12:27:37 crc kubenswrapper[4817]: E0320 12:27:37.454785 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 12:27:37 crc kubenswrapper[4817]: I0320 12:27:37.592938 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:37Z is after 2026-02-23T05:33:13Z
Mar 20 12:27:37 crc kubenswrapper[4817]: W0320 12:27:37.637545 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:37Z is after 2026-02-23T05:33:13Z
Mar 20 12:27:37 crc kubenswrapper[4817]: E0320 12:27:37.637642 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 12:27:37 crc kubenswrapper[4817]: I0320 12:27:37.759990 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 12:27:37 crc kubenswrapper[4817]: I0320 12:27:37.760338 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:37 crc kubenswrapper[4817]: I0320 12:27:37.761776 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:37 crc kubenswrapper[4817]: I0320 12:27:37.761846 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:37 crc kubenswrapper[4817]: I0320 12:27:37.761868 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:37 crc kubenswrapper[4817]: I0320 12:27:37.762864 4817 scope.go:117] "RemoveContainer" containerID="0c602cdc9e81e3dd9081444fa0686992ae42beeed0d12d7687a74702e9c7356d"
Mar 20 12:27:37 crc kubenswrapper[4817]: E0320 12:27:37.763283 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 12:27:38 crc kubenswrapper[4817]: I0320 12:27:38.591672 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:38Z is after 2026-02-23T05:33:13Z
Mar 20 12:27:38 crc kubenswrapper[4817]: E0320 12:27:38.738573 4817 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 12:27:38 crc kubenswrapper[4817]: I0320 12:27:38.990573 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 12:27:38 crc kubenswrapper[4817]: I0320 12:27:38.990826 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:38 crc kubenswrapper[4817]: I0320 12:27:38.992621 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:38 crc kubenswrapper[4817]: I0320 12:27:38.992680 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:38 crc kubenswrapper[4817]: I0320 12:27:38.992697 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.053502 4817 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.053588 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.256242 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.256407 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.257602 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.257650 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.257666 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.258358 4817 scope.go:117] "RemoveContainer" containerID="0c602cdc9e81e3dd9081444fa0686992ae42beeed0d12d7687a74702e9c7356d"
Mar 20 12:27:39 crc kubenswrapper[4817]: E0320 12:27:39.258565 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.263152 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.594280 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:39Z is after 2026-02-23T05:33:13Z
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.646874 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.648578 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.648633 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.648651 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.648684 4817 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 12:27:39 crc kubenswrapper[4817]: E0320 12:27:39.651047 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:39Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 12:27:39 crc kubenswrapper[4817]: E0320 12:27:39.656098 4817 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:27:39Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.796518 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.798029 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.798084 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.798102 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:27:39 crc kubenswrapper[4817]: I0320 12:27:39.799174 4817 scope.go:117] "RemoveContainer" containerID="0c602cdc9e81e3dd9081444fa0686992ae42beeed0d12d7687a74702e9c7356d"
Mar 20 12:27:39 crc kubenswrapper[4817]: E0320 12:27:39.799492 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 12:27:40 crc kubenswrapper[4817]: I0320 12:27:40.595616 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 12:27:41 crc kubenswrapper[4817]: I0320 12:27:41.424629 4817 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 12:27:41 crc kubenswrapper[4817]: I0320 12:27:41.443987 4817 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 20 12:27:41 crc kubenswrapper[4817]: I0320 12:27:41.596383 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 12:27:42 crc kubenswrapper[4817]: W0320 12:27:42.331311 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 20 12:27:42 crc kubenswrapper[4817]: E0320 12:27:42.331378 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 20 12:27:42 crc kubenswrapper[4817]: I0320 12:27:42.596506 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.253888 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5dffa81a55 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.579608149 +0000 UTC m=+0.667920962,LastTimestamp:2026-03-20 12:27:18.579608149 +0000 UTC m=+0.667920962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.260209 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040c7b9c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653295516 +0000 UTC m=+0.741608309,LastTimestamp:2026-03-20 12:27:18.653295516 +0000 UTC m=+0.741608309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.265791 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040cb63e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653310526 +0000 UTC m=+0.741623329,LastTimestamp:2026-03-20 12:27:18.653310526 +0000 UTC m=+0.741623329,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.271565 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040cdf6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653321067 +0000 UTC m=+0.741633870,LastTimestamp:2026-03-20 12:27:18.653321067 +0000 UTC m=+0.741633870,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.278857 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e08970087 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.729482375 +0000 UTC m=+0.817795168,LastTimestamp:2026-03-20 12:27:18.729482375 +0000 UTC m=+0.817795168,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.284674 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040c7b9c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040c7b9c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653295516 +0000 UTC m=+0.741608309,LastTimestamp:2026-03-20 12:27:18.764673339 +0000 UTC m=+0.852986132,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.290349 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040cb63e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040cb63e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653310526 +0000 UTC m=+0.741623329,LastTimestamp:2026-03-20 12:27:18.76470424 +0000 UTC m=+0.853017033,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.295367 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040cdf6b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040cdf6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653321067 +0000 UTC m=+0.741633870,LastTimestamp:2026-03-20 12:27:18.764718661 +0000 UTC m=+0.853031454,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.302508 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040c7b9c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040c7b9c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653295516 +0000 UTC m=+0.741608309,LastTimestamp:2026-03-20 12:27:18.7659298 +0000 UTC m=+0.854242593,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.307966 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040cb63e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040cb63e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653310526 +0000 UTC m=+0.741623329,LastTimestamp:2026-03-20 12:27:18.765942221 +0000 UTC m=+0.854255014,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.313695 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040cdf6b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040cdf6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653321067 +0000 UTC m=+0.741633870,LastTimestamp:2026-03-20 12:27:18.765952681 +0000 UTC m=+0.854265474,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.321034 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040c7b9c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040c7b9c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653295516 +0000 UTC m=+0.741608309,LastTimestamp:2026-03-20 12:27:18.766045944 +0000 UTC m=+0.854358737,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.328098 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040cb63e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040cb63e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653310526 +0000 UTC m=+0.741623329,LastTimestamp:2026-03-20 12:27:18.766063925 +0000 UTC m=+0.854376718,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.334180 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040cdf6b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040cdf6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653321067 +0000 UTC m=+0.741633870,LastTimestamp:2026-03-20 12:27:18.766077615 +0000 UTC m=+0.854390418,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.338650 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040c7b9c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040c7b9c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653295516 +0000 UTC m=+0.741608309,LastTimestamp:2026-03-20 12:27:18.767235073 +0000 UTC m=+0.855547866,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.342025 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040cb63e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040cb63e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653310526 +0000 UTC m=+0.741623329,LastTimestamp:2026-03-20 12:27:18.767262314 +0000 UTC m=+0.855575107,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.343804 4817 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040cdf6b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040cdf6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653321067 +0000 UTC m=+0.741633870,LastTimestamp:2026-03-20 12:27:18.767278554 +0000 UTC m=+0.855591347,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.348740 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040c7b9c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040c7b9c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653295516 +0000 UTC m=+0.741608309,LastTimestamp:2026-03-20 12:27:18.767366347 +0000 UTC m=+0.855679140,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.352423 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040cb63e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040cb63e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653310526 +0000 UTC m=+0.741623329,LastTimestamp:2026-03-20 12:27:18.76745135 +0000 UTC m=+0.855764153,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.356393 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040cdf6b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040cdf6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653321067 +0000 UTC m=+0.741633870,LastTimestamp:2026-03-20 12:27:18.767482101 +0000 UTC m=+0.855794894,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.361041 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040c7b9c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040c7b9c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653295516 +0000 UTC m=+0.741608309,LastTimestamp:2026-03-20 12:27:18.76804405 +0000 UTC m=+0.856356843,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.365002 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040cb63e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040cb63e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653310526 +0000 UTC m=+0.741623329,LastTimestamp:2026-03-20 12:27:18.76806459 +0000 UTC m=+0.856377383,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.369931 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040cdf6b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040cdf6b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653321067 +0000 UTC m=+0.741633870,LastTimestamp:2026-03-20 12:27:18.768075101 +0000 UTC m=+0.856387894,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.374110 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040c7b9c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040c7b9c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653295516 +0000 UTC m=+0.741608309,LastTimestamp:2026-03-20 12:27:18.768860376 +0000 UTC m=+0.857173169,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.378622 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8c5e040cb63e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8c5e040cb63e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:18.653310526 +0000 UTC 
m=+0.741623329,LastTimestamp:2026-03-20 12:27:18.768871967 +0000 UTC m=+0.857184760,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.386456 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c5e2300fdec openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:19.17263614 +0000 UTC m=+1.260948963,LastTimestamp:2026-03-20 12:27:19.17263614 +0000 UTC m=+1.260948963,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.390453 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8c5e23e55c6d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:19.187602541 +0000 UTC m=+1.275915364,LastTimestamp:2026-03-20 12:27:19.187602541 +0000 UTC m=+1.275915364,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.394948 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8c5e23e60dd0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:19.187647952 +0000 UTC m=+1.275960775,LastTimestamp:2026-03-20 12:27:19.187647952 +0000 UTC m=+1.275960775,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.402270 4817 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5e259c5ba5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:19.216372645 +0000 UTC m=+1.304685438,LastTimestamp:2026-03-20 12:27:19.216372645 +0000 UTC m=+1.304685438,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.409401 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5e2672167c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:19.230379644 +0000 UTC m=+1.318692437,LastTimestamp:2026-03-20 12:27:19.230379644 
+0000 UTC m=+1.318692437,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.417490 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c5e4534dbd7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:19.746460631 +0000 UTC m=+1.834773414,LastTimestamp:2026-03-20 12:27:19.746460631 +0000 UTC m=+1.834773414,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.421430 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c5e45cab767 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container 
kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:19.756281703 +0000 UTC m=+1.844594486,LastTimestamp:2026-03-20 12:27:19.756281703 +0000 UTC m=+1.844594486,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.425758 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c5e45e32f70 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:19.757885296 +0000 UTC m=+1.846198079,LastTimestamp:2026-03-20 12:27:19.757885296 +0000 UTC m=+1.846198079,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.430361 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8c5e46bde084 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:19.772217476 +0000 UTC m=+1.860530259,LastTimestamp:2026-03-20 12:27:19.772217476 +0000 UTC m=+1.860530259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.434873 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8c5e46f49be0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:19.775804384 +0000 UTC m=+1.864117167,LastTimestamp:2026-03-20 12:27:19.775804384 +0000 UTC m=+1.864117167,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.439959 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5e470f7255 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:19.777563221 +0000 UTC m=+1.865876004,LastTimestamp:2026-03-20 12:27:19.777563221 +0000 UTC m=+1.865876004,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.443326 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5e47191234 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:19.778193972 +0000 UTC m=+1.866506755,LastTimestamp:2026-03-20 12:27:19.778193972 +0000 UTC m=+1.866506755,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.444993 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8c5e479ae9cf openshift-machine-config-operator 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:19.786703311 +0000 UTC m=+1.875016094,LastTimestamp:2026-03-20 12:27:19.786703311 +0000 UTC m=+1.875016094,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.448312 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8c5e479c3ada openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:19.786789594 +0000 UTC m=+1.875102377,LastTimestamp:2026-03-20 12:27:19.786789594 +0000 UTC m=+1.875102377,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.451819 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e8c5e480a7191 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:19.794012561 +0000 UTC m=+1.882325334,LastTimestamp:2026-03-20 12:27:19.794012561 +0000 UTC m=+1.882325334,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.455277 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5e48338179 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:19.796703609 +0000 UTC m=+1.885016392,LastTimestamp:2026-03-20 12:27:19.796703609 +0000 UTC m=+1.885016392,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.458581 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c5e591ac002 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.08029389 +0000 UTC m=+2.168606713,LastTimestamp:2026-03-20 12:27:20.08029389 +0000 UTC m=+2.168606713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.462626 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c5e5a00d5fa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.095372794 +0000 UTC m=+2.183685607,LastTimestamp:2026-03-20 12:27:20.095372794 +0000 UTC m=+2.183685607,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.466083 4817 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c5e5a1cf0a1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.097214625 +0000 UTC m=+2.185527438,LastTimestamp:2026-03-20 12:27:20.097214625 +0000 UTC m=+2.185527438,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.469768 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c5e6818967d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.331810429 +0000 UTC 
m=+2.420123212,LastTimestamp:2026-03-20 12:27:20.331810429 +0000 UTC m=+2.420123212,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.472708 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c5e6905212c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.347312428 +0000 UTC m=+2.435625221,LastTimestamp:2026-03-20 12:27:20.347312428 +0000 UTC m=+2.435625221,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.475895 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c5e6912fe3b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.348220987 +0000 UTC m=+2.436533780,LastTimestamp:2026-03-20 12:27:20.348220987 +0000 UTC m=+2.436533780,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.480042 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c5e7638a35e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.568791902 +0000 UTC m=+2.657104725,LastTimestamp:2026-03-20 12:27:20.568791902 +0000 UTC m=+2.657104725,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.484091 4817 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c5e76fc778f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.581625743 +0000 UTC m=+2.669938546,LastTimestamp:2026-03-20 12:27:20.581625743 +0000 UTC m=+2.669938546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.487883 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5e7cbfbe62 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.678309474 +0000 UTC 
m=+2.766622267,LastTimestamp:2026-03-20 12:27:20.678309474 +0000 UTC m=+2.766622267,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.493571 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5e7cf30218 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.681669144 +0000 UTC m=+2.769981937,LastTimestamp:2026-03-20 12:27:20.681669144 +0000 UTC m=+2.769981937,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.498388 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8c5e7d25e3ae openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.685003694 +0000 UTC m=+2.773316527,LastTimestamp:2026-03-20 12:27:20.685003694 +0000 UTC m=+2.773316527,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.505620 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8c5e7d772a74 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.690330228 +0000 UTC m=+2.778643021,LastTimestamp:2026-03-20 12:27:20.690330228 +0000 UTC m=+2.778643021,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.509984 4817 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5e8a445b91 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.905104273 +0000 UTC m=+2.993417056,LastTimestamp:2026-03-20 12:27:20.905104273 +0000 UTC m=+2.993417056,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.513333 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8c5e8a7c7233 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.908780083 +0000 UTC m=+2.997092866,LastTimestamp:2026-03-20 12:27:20.908780083 +0000 UTC m=+2.997092866,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 
12:27:43.519478 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8c5e8a7ce5f2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.908809714 +0000 UTC m=+2.997122497,LastTimestamp:2026-03-20 12:27:20.908809714 +0000 UTC m=+2.997122497,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.523848 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5e8a7dee23 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.908877347 +0000 UTC m=+2.997190130,LastTimestamp:2026-03-20 12:27:20.908877347 +0000 UTC m=+2.997190130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.528307 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5e8b586bc4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.923196356 +0000 UTC m=+3.011509139,LastTimestamp:2026-03-20 12:27:20.923196356 +0000 UTC m=+3.011509139,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.533451 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5e8b652a56 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.924031574 +0000 UTC m=+3.012344357,LastTimestamp:2026-03-20 12:27:20.924031574 +0000 UTC m=+3.012344357,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.538573 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8c5e8b6dfaef openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.924609263 +0000 UTC m=+3.012922046,LastTimestamp:2026-03-20 12:27:20.924609263 +0000 UTC m=+3.012922046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.542987 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8c5e8b6e0c51 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.924613713 +0000 UTC m=+3.012926506,LastTimestamp:2026-03-20 12:27:20.924613713 +0000 UTC m=+3.012926506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.551357 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5e8b7575b9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.925099449 +0000 UTC m=+3.013412232,LastTimestamp:2026-03-20 12:27:20.925099449 +0000 UTC m=+3.013412232,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.557418 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8c5e8b777f93 openshift-kube-scheduler 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.925233043 +0000 UTC m=+3.013545816,LastTimestamp:2026-03-20 12:27:20.925233043 +0000 UTC m=+3.013545816,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.562136 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8c5e974d451c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.123792156 +0000 UTC m=+3.212104949,LastTimestamp:2026-03-20 12:27:21.123792156 +0000 UTC m=+3.212104949,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.566832 4817 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5e9750452e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.123988782 +0000 UTC m=+3.212301585,LastTimestamp:2026-03-20 12:27:21.123988782 +0000 UTC m=+3.212301585,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.573117 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8c5e9859af6f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.141383023 +0000 UTC m=+3.229695826,LastTimestamp:2026-03-20 12:27:21.141383023 +0000 UTC m=+3.229695826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: 
E0320 12:27:43.579412 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8c5e98ef3036 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.151180854 +0000 UTC m=+3.239493637,LastTimestamp:2026-03-20 12:27:21.151180854 +0000 UTC m=+3.239493637,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.585297 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5e9910e5f3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.153390067 +0000 UTC m=+3.241702860,LastTimestamp:2026-03-20 
12:27:21.153390067 +0000 UTC m=+3.241702860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: I0320 12:27:43.591172 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.591934 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5e9928c606 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.154954758 +0000 UTC m=+3.243267541,LastTimestamp:2026-03-20 12:27:21.154954758 +0000 UTC m=+3.243267541,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.595304 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5ea46cf8ca openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.343973578 +0000 UTC m=+3.432286361,LastTimestamp:2026-03-20 12:27:21.343973578 +0000 UTC m=+3.432286361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.601807 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8c5ea4b0a179 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.348407673 +0000 UTC m=+3.436720456,LastTimestamp:2026-03-20 12:27:21.348407673 +0000 UTC m=+3.436720456,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.605858 4817 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5ea5963c3d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.363455037 +0000 UTC m=+3.451767820,LastTimestamp:2026-03-20 12:27:21.363455037 +0000 UTC m=+3.451767820,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.612152 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5ea5ab6bc6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.364843462 +0000 UTC m=+3.453156255,LastTimestamp:2026-03-20 12:27:21.364843462 +0000 UTC 
m=+3.453156255,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.618402 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8c5ea5d75050 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.367720016 +0000 UTC m=+3.456032809,LastTimestamp:2026-03-20 12:27:21.367720016 +0000 UTC m=+3.456032809,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.626233 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5eb27f0298 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container 
kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.58003676 +0000 UTC m=+3.668349543,LastTimestamp:2026-03-20 12:27:21.58003676 +0000 UTC m=+3.668349543,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.632725 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5eb33e0cf7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.592556791 +0000 UTC m=+3.680869584,LastTimestamp:2026-03-20 12:27:21.592556791 +0000 UTC m=+3.680869584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.637507 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5eb34d4238 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.593553464 +0000 UTC m=+3.681866247,LastTimestamp:2026-03-20 12:27:21.593553464 +0000 UTC m=+3.681866247,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.642775 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5eb9a0daab openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.699695275 +0000 UTC m=+3.788008058,LastTimestamp:2026-03-20 12:27:21.699695275 +0000 UTC m=+3.788008058,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.649327 4817 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5ec0812eb5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.815060149 +0000 UTC m=+3.903372932,LastTimestamp:2026-03-20 12:27:21.815060149 +0000 UTC m=+3.903372932,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.653906 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5ec1108635 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.824454197 +0000 UTC m=+3.912766980,LastTimestamp:2026-03-20 12:27:21.824454197 +0000 UTC m=+3.912766980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc 
kubenswrapper[4817]: E0320 12:27:43.657073 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5ec7d5c3a5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.938043813 +0000 UTC m=+4.026356586,LastTimestamp:2026-03-20 12:27:21.938043813 +0000 UTC m=+4.026356586,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.660191 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5ec8ab465a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.952036442 +0000 UTC m=+4.040349225,LastTimestamp:2026-03-20 12:27:21.952036442 +0000 UTC m=+4.040349225,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.661892 
4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5ef5fb1512 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:22.712241426 +0000 UTC m=+4.800554209,LastTimestamp:2026-03-20 12:27:22.712241426 +0000 UTC m=+4.800554209,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.666541 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5f04ff411c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:22.964173084 +0000 UTC m=+5.052485887,LastTimestamp:2026-03-20 12:27:22.964173084 +0000 UTC m=+5.052485887,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc 
kubenswrapper[4817]: E0320 12:27:43.671916 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5f05c3ee97 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:22.977062551 +0000 UTC m=+5.065375354,LastTimestamp:2026-03-20 12:27:22.977062551 +0000 UTC m=+5.065375354,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.677326 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5f05db2929 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:22.978584873 +0000 UTC m=+5.066897666,LastTimestamp:2026-03-20 12:27:22.978584873 +0000 UTC m=+5.066897666,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.683055 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5f1341c0bf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:23.203412159 +0000 UTC m=+5.291724992,LastTimestamp:2026-03-20 12:27:23.203412159 +0000 UTC m=+5.291724992,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: I0320 12:27:43.687643 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:27:43 crc kubenswrapper[4817]: I0320 12:27:43.687863 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:43 crc kubenswrapper[4817]: I0320 12:27:43.689256 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:43 crc kubenswrapper[4817]: I0320 12:27:43.689293 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:43 crc kubenswrapper[4817]: I0320 12:27:43.689338 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.689567 4817 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5f140e7bb1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:23.216829361 +0000 UTC m=+5.305142184,LastTimestamp:2026-03-20 12:27:23.216829361 +0000 UTC m=+5.305142184,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: I0320 12:27:43.690029 4817 scope.go:117] "RemoveContainer" containerID="0c602cdc9e81e3dd9081444fa0686992ae42beeed0d12d7687a74702e9c7356d" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.690265 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.693330 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5f1425d35e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:23.218359134 +0000 UTC m=+5.306671957,LastTimestamp:2026-03-20 12:27:23.218359134 +0000 UTC m=+5.306671957,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.697348 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5f228a5f2f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:23.459829551 +0000 UTC m=+5.548142334,LastTimestamp:2026-03-20 12:27:23.459829551 +0000 UTC m=+5.548142334,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.701011 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5f23402f72 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:23.471744882 +0000 UTC m=+5.560057705,LastTimestamp:2026-03-20 12:27:23.471744882 +0000 UTC m=+5.560057705,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.705643 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5f23562dc2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:23.473186242 +0000 UTC m=+5.561499065,LastTimestamp:2026-03-20 12:27:23.473186242 +0000 UTC m=+5.561499065,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.709860 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5f32a321d5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:23.729887701 +0000 UTC m=+5.818200514,LastTimestamp:2026-03-20 12:27:23.729887701 +0000 UTC m=+5.818200514,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.713544 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5f334cbc9d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:23.741002909 +0000 UTC m=+5.829315692,LastTimestamp:2026-03-20 12:27:23.741002909 +0000 UTC m=+5.829315692,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.718623 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5f335cad6e 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:23.742047598 +0000 UTC m=+5.830360391,LastTimestamp:2026-03-20 12:27:23.742047598 +0000 UTC m=+5.830360391,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.723025 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8c5f40d0991a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:23.967748378 +0000 UTC m=+6.056061161,LastTimestamp:2026-03-20 12:27:23.967748378 +0000 UTC m=+6.056061161,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.727616 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e8c5f41d7283f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:23.984955455 +0000 UTC m=+6.073268258,LastTimestamp:2026-03-20 12:27:23.984955455 +0000 UTC m=+6.073268258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.733946 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 12:27:43 crc kubenswrapper[4817]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8c606ff3d787 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 12:27:43 crc kubenswrapper[4817]: body: Mar 20 12:27:43 crc kubenswrapper[4817]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:29.053554567 +0000 UTC m=+11.141867380,LastTimestamp:2026-03-20 12:27:29.053554567 +0000 UTC m=+11.141867380,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} 
Mar 20 12:27:43 crc kubenswrapper[4817]: > Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.739677 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c606ff4e471 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:29.053623409 +0000 UTC m=+11.141936232,LastTimestamp:2026-03-20 12:27:29.053623409 +0000 UTC m=+11.141936232,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.743943 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8c5eb34d4238\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5eb34d4238 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.593553464 +0000 UTC m=+3.681866247,LastTimestamp:2026-03-20 12:27:32.76257955 +0000 UTC m=+14.850892323,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.748208 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8c5ec0812eb5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5ec0812eb5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.815060149 +0000 UTC m=+3.903372932,LastTimestamp:2026-03-20 12:27:32.960720896 +0000 UTC m=+15.049033679,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.754225 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8c5ec1108635\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c5ec1108635 openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:21.824454197 +0000 UTC m=+3.912766980,LastTimestamp:2026-03-20 12:27:32.971616278 +0000 UTC m=+15.059929061,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.761107 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 12:27:43 crc kubenswrapper[4817]: &Event{ObjectMeta:{kube-apiserver-crc.189e8c6169d92675 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 12:27:43 crc kubenswrapper[4817]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 12:27:43 crc kubenswrapper[4817]: Mar 20 12:27:43 crc kubenswrapper[4817]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:33.246109301 +0000 UTC m=+15.334422084,LastTimestamp:2026-03-20 12:27:33.246109301 +0000 UTC m=+15.334422084,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 12:27:43 crc kubenswrapper[4817]: > Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.765263 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c6169da224d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:33.246173773 +0000 UTC m=+15.334486556,LastTimestamp:2026-03-20 12:27:33.246173773 +0000 UTC m=+15.334486556,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.769085 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8c6169d92675\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 12:27:43 crc kubenswrapper[4817]: &Event{ObjectMeta:{kube-apiserver-crc.189e8c6169d92675 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 12:27:43 crc 
kubenswrapper[4817]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 12:27:43 crc kubenswrapper[4817]: Mar 20 12:27:43 crc kubenswrapper[4817]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:33.246109301 +0000 UTC m=+15.334422084,LastTimestamp:2026-03-20 12:27:33.25039576 +0000 UTC m=+15.338708543,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 12:27:43 crc kubenswrapper[4817]: > Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.773004 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8c6169da224d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8c6169da224d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:33.246173773 +0000 UTC m=+15.334486556,LastTimestamp:2026-03-20 12:27:33.250444651 +0000 UTC m=+15.338757434,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.777083 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event=< Mar 20 12:27:43 crc kubenswrapper[4817]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8c62c3ffed0c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 12:27:43 crc kubenswrapper[4817]: body: Mar 20 12:27:43 crc kubenswrapper[4817]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:39.053567244 +0000 UTC m=+21.141880067,LastTimestamp:2026-03-20 12:27:39.053567244 +0000 UTC m=+21.141880067,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 12:27:43 crc kubenswrapper[4817]: > Mar 20 12:27:43 crc kubenswrapper[4817]: E0320 12:27:43.780180 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c62c400d1a6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:39.053625766 +0000 UTC m=+21.141938589,LastTimestamp:2026-03-20 12:27:39.053625766 +0000 UTC m=+21.141938589,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:44 crc kubenswrapper[4817]: W0320 12:27:44.009841 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:44 crc kubenswrapper[4817]: E0320 12:27:44.009948 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 12:27:44 crc kubenswrapper[4817]: I0320 12:27:44.597878 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:44 crc kubenswrapper[4817]: W0320 12:27:44.849504 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 12:27:44 crc kubenswrapper[4817]: E0320 12:27:44.849585 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" 
Mar 20 12:27:45 crc kubenswrapper[4817]: I0320 12:27:45.596766 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:46 crc kubenswrapper[4817]: I0320 12:27:46.596227 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:46 crc kubenswrapper[4817]: I0320 12:27:46.657226 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:46 crc kubenswrapper[4817]: I0320 12:27:46.658991 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:46 crc kubenswrapper[4817]: I0320 12:27:46.659026 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:46 crc kubenswrapper[4817]: I0320 12:27:46.659038 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:46 crc kubenswrapper[4817]: I0320 12:27:46.659059 4817 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 12:27:46 crc kubenswrapper[4817]: E0320 12:27:46.661512 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 12:27:46 crc kubenswrapper[4817]: E0320 12:27:46.668094 4817 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the 
cluster scope" node="crc" Mar 20 12:27:47 crc kubenswrapper[4817]: I0320 12:27:47.594668 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:48 crc kubenswrapper[4817]: I0320 12:27:48.593918 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:48 crc kubenswrapper[4817]: E0320 12:27:48.738706 4817 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.052979 4817 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.053080 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.053207 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.053417 4817 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.054999 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.055072 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.055085 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.055851 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"d8714a8ff60ef9382ef5a63393a34d3aeeb6ac73694c5ad1729d5362a92a5bcd"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.056014 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://d8714a8ff60ef9382ef5a63393a34d3aeeb6ac73694c5ad1729d5362a92a5bcd" gracePeriod=30 Mar 20 12:27:49 crc kubenswrapper[4817]: E0320 12:27:49.060738 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8c62c3ffed0c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 12:27:49 crc kubenswrapper[4817]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8c62c3ffed0c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 12:27:49 crc kubenswrapper[4817]: body: Mar 20 12:27:49 crc kubenswrapper[4817]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:39.053567244 +0000 UTC m=+21.141880067,LastTimestamp:2026-03-20 12:27:49.053056814 +0000 UTC m=+31.141369627,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 12:27:49 crc kubenswrapper[4817]: > Mar 20 12:27:49 crc kubenswrapper[4817]: E0320 12:27:49.065862 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8c62c400d1a6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c62c400d1a6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:39.053625766 +0000 UTC m=+21.141938589,LastTimestamp:2026-03-20 12:27:49.053170458 
+0000 UTC m=+31.141483271,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:49 crc kubenswrapper[4817]: E0320 12:27:49.072997 4817 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c651830e822 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:49.055997986 +0000 UTC m=+31.144310769,LastTimestamp:2026-03-20 12:27:49.055997986 +0000 UTC m=+31.144310769,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:49 crc kubenswrapper[4817]: E0320 12:27:49.179628 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8c5e45e32f70\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c5e45e32f70 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:19.757885296 +0000 UTC m=+1.846198079,LastTimestamp:2026-03-20 12:27:49.172504387 +0000 UTC m=+31.260817190,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:49 crc kubenswrapper[4817]: W0320 12:27:49.411620 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 12:27:49 crc kubenswrapper[4817]: E0320 12:27:49.411701 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 12:27:49 crc kubenswrapper[4817]: E0320 12:27:49.442060 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8c5e591ac002\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c5e591ac002 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.08029389 +0000 UTC m=+2.168606713,LastTimestamp:2026-03-20 12:27:49.43436986 +0000 UTC m=+31.522682683,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:49 crc kubenswrapper[4817]: E0320 12:27:49.452457 4817 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8c5e5a00d5fa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8c5e5a00d5fa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:27:20.095372794 +0000 UTC m=+2.183685607,LastTimestamp:2026-03-20 12:27:49.450240161 +0000 UTC m=+31.538552984,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.595232 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot 
get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.830935 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.831323 4817 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d8714a8ff60ef9382ef5a63393a34d3aeeb6ac73694c5ad1729d5362a92a5bcd" exitCode=255 Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.831376 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d8714a8ff60ef9382ef5a63393a34d3aeeb6ac73694c5ad1729d5362a92a5bcd"} Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.831423 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3d7e7084905b22c9220548d39c340bf7cc3a11e5be9e10a6f4e7525947efe2b5"} Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.831548 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.832604 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.832650 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:49 crc kubenswrapper[4817]: I0320 12:27:49.832666 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:50 crc kubenswrapper[4817]: I0320 12:27:50.595171 4817 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:51 crc kubenswrapper[4817]: I0320 12:27:51.593637 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:52 crc kubenswrapper[4817]: I0320 12:27:52.593775 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:53 crc kubenswrapper[4817]: I0320 12:27:53.593742 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:53 crc kubenswrapper[4817]: E0320 12:27:53.667267 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 12:27:53 crc kubenswrapper[4817]: I0320 12:27:53.668325 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:53 crc kubenswrapper[4817]: I0320 12:27:53.670163 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:53 crc kubenswrapper[4817]: I0320 12:27:53.670211 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 
12:27:53 crc kubenswrapper[4817]: I0320 12:27:53.670230 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:53 crc kubenswrapper[4817]: I0320 12:27:53.670269 4817 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 12:27:53 crc kubenswrapper[4817]: E0320 12:27:53.678215 4817 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 12:27:54 crc kubenswrapper[4817]: I0320 12:27:54.593240 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:55 crc kubenswrapper[4817]: I0320 12:27:55.597345 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:56 crc kubenswrapper[4817]: I0320 12:27:56.053528 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 12:27:56 crc kubenswrapper[4817]: I0320 12:27:56.053875 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:56 crc kubenswrapper[4817]: I0320 12:27:56.055961 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:56 crc kubenswrapper[4817]: I0320 12:27:56.056038 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:56 crc kubenswrapper[4817]: I0320 12:27:56.056065 4817 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:56 crc kubenswrapper[4817]: I0320 12:27:56.063905 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 12:27:56 crc kubenswrapper[4817]: I0320 12:27:56.593496 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:56 crc kubenswrapper[4817]: I0320 12:27:56.663225 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:56 crc kubenswrapper[4817]: I0320 12:27:56.664796 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:56 crc kubenswrapper[4817]: I0320 12:27:56.664900 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:56 crc kubenswrapper[4817]: I0320 12:27:56.664915 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:56 crc kubenswrapper[4817]: I0320 12:27:56.665877 4817 scope.go:117] "RemoveContainer" containerID="0c602cdc9e81e3dd9081444fa0686992ae42beeed0d12d7687a74702e9c7356d" Mar 20 12:27:56 crc kubenswrapper[4817]: I0320 12:27:56.852110 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:56 crc kubenswrapper[4817]: I0320 12:27:56.852231 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 12:27:56 crc kubenswrapper[4817]: I0320 12:27:56.853107 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:56 
crc kubenswrapper[4817]: I0320 12:27:56.853174 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:56 crc kubenswrapper[4817]: I0320 12:27:56.853187 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:57 crc kubenswrapper[4817]: W0320 12:27:57.516673 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:57 crc kubenswrapper[4817]: E0320 12:27:57.516943 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 12:27:57 crc kubenswrapper[4817]: I0320 12:27:57.592234 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:57 crc kubenswrapper[4817]: I0320 12:27:57.856065 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 12:27:57 crc kubenswrapper[4817]: I0320 12:27:57.857774 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"71413a1b5d1dddab7dc4539e905bef27a38f4c8abf75d13dbe547cf8a895de8f"} Mar 20 12:27:57 crc kubenswrapper[4817]: I0320 12:27:57.857863 4817 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 20 12:27:57 crc kubenswrapper[4817]: I0320 12:27:57.857962 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:57 crc kubenswrapper[4817]: I0320 12:27:57.858895 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:57 crc kubenswrapper[4817]: I0320 12:27:57.858905 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:57 crc kubenswrapper[4817]: I0320 12:27:57.858928 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:57 crc kubenswrapper[4817]: I0320 12:27:57.858940 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:57 crc kubenswrapper[4817]: I0320 12:27:57.858943 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:57 crc kubenswrapper[4817]: I0320 12:27:57.858961 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:58 crc kubenswrapper[4817]: I0320 12:27:58.592689 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:58 crc kubenswrapper[4817]: E0320 12:27:58.738822 4817 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 12:27:58 crc kubenswrapper[4817]: I0320 12:27:58.862648 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 
12:27:58 crc kubenswrapper[4817]: I0320 12:27:58.863842 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 12:27:58 crc kubenswrapper[4817]: I0320 12:27:58.866484 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="71413a1b5d1dddab7dc4539e905bef27a38f4c8abf75d13dbe547cf8a895de8f" exitCode=255 Mar 20 12:27:58 crc kubenswrapper[4817]: I0320 12:27:58.866544 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"71413a1b5d1dddab7dc4539e905bef27a38f4c8abf75d13dbe547cf8a895de8f"} Mar 20 12:27:58 crc kubenswrapper[4817]: I0320 12:27:58.866603 4817 scope.go:117] "RemoveContainer" containerID="0c602cdc9e81e3dd9081444fa0686992ae42beeed0d12d7687a74702e9c7356d" Mar 20 12:27:58 crc kubenswrapper[4817]: I0320 12:27:58.866859 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:27:58 crc kubenswrapper[4817]: I0320 12:27:58.869231 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:27:58 crc kubenswrapper[4817]: I0320 12:27:58.869327 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:27:58 crc kubenswrapper[4817]: I0320 12:27:58.869390 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:27:58 crc kubenswrapper[4817]: I0320 12:27:58.869943 4817 scope.go:117] "RemoveContainer" containerID="71413a1b5d1dddab7dc4539e905bef27a38f4c8abf75d13dbe547cf8a895de8f" Mar 20 12:27:58 crc kubenswrapper[4817]: E0320 12:27:58.870188 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 12:27:59 crc kubenswrapper[4817]: I0320 12:27:59.593603 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:27:59 crc kubenswrapper[4817]: I0320 12:27:59.869905 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 12:28:00 crc kubenswrapper[4817]: I0320 12:28:00.593493 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:00 crc kubenswrapper[4817]: E0320 12:28:00.674482 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 12:28:00 crc kubenswrapper[4817]: I0320 12:28:00.678493 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:28:00 crc kubenswrapper[4817]: I0320 12:28:00.679723 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:00 crc kubenswrapper[4817]: I0320 12:28:00.679960 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 20 12:28:00 crc kubenswrapper[4817]: I0320 12:28:00.680407 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:00 crc kubenswrapper[4817]: I0320 12:28:00.680451 4817 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 12:28:00 crc kubenswrapper[4817]: E0320 12:28:00.684816 4817 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 12:28:01 crc kubenswrapper[4817]: I0320 12:28:01.592658 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:02 crc kubenswrapper[4817]: I0320 12:28:02.592873 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:02 crc kubenswrapper[4817]: W0320 12:28:02.794856 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 12:28:02 crc kubenswrapper[4817]: E0320 12:28:02.794942 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 12:28:03 crc kubenswrapper[4817]: W0320 12:28:03.471697 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User 
"system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 12:28:03 crc kubenswrapper[4817]: E0320 12:28:03.471972 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 12:28:03 crc kubenswrapper[4817]: I0320 12:28:03.596588 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:03 crc kubenswrapper[4817]: I0320 12:28:03.687501 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:28:03 crc kubenswrapper[4817]: I0320 12:28:03.687654 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:28:03 crc kubenswrapper[4817]: I0320 12:28:03.688666 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:03 crc kubenswrapper[4817]: I0320 12:28:03.688706 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:03 crc kubenswrapper[4817]: I0320 12:28:03.688716 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:03 crc kubenswrapper[4817]: I0320 12:28:03.689217 4817 scope.go:117] "RemoveContainer" containerID="71413a1b5d1dddab7dc4539e905bef27a38f4c8abf75d13dbe547cf8a895de8f" Mar 20 12:28:03 crc kubenswrapper[4817]: E0320 12:28:03.689363 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with 
CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 12:28:04 crc kubenswrapper[4817]: W0320 12:28:04.242427 4817 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 12:28:04 crc kubenswrapper[4817]: E0320 12:28:04.242478 4817 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 12:28:04 crc kubenswrapper[4817]: I0320 12:28:04.593216 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:05 crc kubenswrapper[4817]: I0320 12:28:05.594805 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:06 crc kubenswrapper[4817]: I0320 12:28:06.595030 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:07 crc kubenswrapper[4817]: I0320 12:28:07.566488 4817 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 12:28:07 crc kubenswrapper[4817]: I0320 12:28:07.566711 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:28:07 crc kubenswrapper[4817]: I0320 12:28:07.568419 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:07 crc kubenswrapper[4817]: I0320 12:28:07.568480 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:07 crc kubenswrapper[4817]: I0320 12:28:07.568508 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:07 crc kubenswrapper[4817]: I0320 12:28:07.594682 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:07 crc kubenswrapper[4817]: E0320 12:28:07.679857 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 12:28:07 crc kubenswrapper[4817]: I0320 12:28:07.685701 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:28:07 crc kubenswrapper[4817]: I0320 12:28:07.687144 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:07 crc kubenswrapper[4817]: I0320 12:28:07.687197 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:07 crc kubenswrapper[4817]: I0320 12:28:07.687216 4817 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:07 crc kubenswrapper[4817]: I0320 12:28:07.687256 4817 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 12:28:07 crc kubenswrapper[4817]: E0320 12:28:07.691271 4817 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 12:28:07 crc kubenswrapper[4817]: I0320 12:28:07.760501 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:28:07 crc kubenswrapper[4817]: I0320 12:28:07.760728 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:28:07 crc kubenswrapper[4817]: I0320 12:28:07.762030 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:07 crc kubenswrapper[4817]: I0320 12:28:07.762061 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:07 crc kubenswrapper[4817]: I0320 12:28:07.762070 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:07 crc kubenswrapper[4817]: I0320 12:28:07.762551 4817 scope.go:117] "RemoveContainer" containerID="71413a1b5d1dddab7dc4539e905bef27a38f4c8abf75d13dbe547cf8a895de8f" Mar 20 12:28:07 crc kubenswrapper[4817]: E0320 12:28:07.762782 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 12:28:08 crc kubenswrapper[4817]: I0320 12:28:08.593005 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:08 crc kubenswrapper[4817]: E0320 12:28:08.738938 4817 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 12:28:09 crc kubenswrapper[4817]: I0320 12:28:09.593940 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:10 crc kubenswrapper[4817]: I0320 12:28:10.597172 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:11 crc kubenswrapper[4817]: I0320 12:28:11.596377 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:12 crc kubenswrapper[4817]: I0320 12:28:12.595566 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:13 crc kubenswrapper[4817]: I0320 12:28:13.595354 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:14 crc kubenswrapper[4817]: I0320 12:28:14.425847 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 12:28:14 crc kubenswrapper[4817]: I0320 12:28:14.426032 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:28:14 crc kubenswrapper[4817]: I0320 12:28:14.427805 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:14 crc kubenswrapper[4817]: I0320 12:28:14.427857 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:14 crc kubenswrapper[4817]: I0320 12:28:14.427871 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:14 crc kubenswrapper[4817]: I0320 12:28:14.595383 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:14 crc kubenswrapper[4817]: E0320 12:28:14.686972 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 12:28:14 crc kubenswrapper[4817]: I0320 12:28:14.691389 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:28:14 crc kubenswrapper[4817]: I0320 12:28:14.692570 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:14 crc kubenswrapper[4817]: I0320 12:28:14.692641 4817 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:14 crc kubenswrapper[4817]: I0320 12:28:14.692664 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:14 crc kubenswrapper[4817]: I0320 12:28:14.692706 4817 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 12:28:14 crc kubenswrapper[4817]: E0320 12:28:14.698481 4817 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 12:28:15 crc kubenswrapper[4817]: I0320 12:28:15.597727 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:16 crc kubenswrapper[4817]: I0320 12:28:16.597163 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:17 crc kubenswrapper[4817]: I0320 12:28:17.595045 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:18 crc kubenswrapper[4817]: I0320 12:28:18.596969 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:18 crc kubenswrapper[4817]: E0320 12:28:18.739104 4817 eviction_manager.go:285] "Eviction 
manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 12:28:19 crc kubenswrapper[4817]: I0320 12:28:19.594730 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:19 crc kubenswrapper[4817]: I0320 12:28:19.663005 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:28:19 crc kubenswrapper[4817]: I0320 12:28:19.664660 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:19 crc kubenswrapper[4817]: I0320 12:28:19.664774 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:19 crc kubenswrapper[4817]: I0320 12:28:19.664795 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:19 crc kubenswrapper[4817]: I0320 12:28:19.665744 4817 scope.go:117] "RemoveContainer" containerID="71413a1b5d1dddab7dc4539e905bef27a38f4c8abf75d13dbe547cf8a895de8f" Mar 20 12:28:19 crc kubenswrapper[4817]: I0320 12:28:19.946625 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 12:28:19 crc kubenswrapper[4817]: I0320 12:28:19.949169 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3e572dcd05d0127d887dd34d43b57ba70ca318e81366eeeed89df28e8cd63823"} Mar 20 12:28:19 crc kubenswrapper[4817]: I0320 12:28:19.949299 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:28:19 crc 
kubenswrapper[4817]: I0320 12:28:19.950475 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:19 crc kubenswrapper[4817]: I0320 12:28:19.950506 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:19 crc kubenswrapper[4817]: I0320 12:28:19.950518 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:20 crc kubenswrapper[4817]: I0320 12:28:20.596641 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:20 crc kubenswrapper[4817]: I0320 12:28:20.953488 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 12:28:20 crc kubenswrapper[4817]: I0320 12:28:20.954028 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 12:28:20 crc kubenswrapper[4817]: I0320 12:28:20.955664 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3e572dcd05d0127d887dd34d43b57ba70ca318e81366eeeed89df28e8cd63823" exitCode=255 Mar 20 12:28:20 crc kubenswrapper[4817]: I0320 12:28:20.955712 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3e572dcd05d0127d887dd34d43b57ba70ca318e81366eeeed89df28e8cd63823"} Mar 20 12:28:20 crc kubenswrapper[4817]: I0320 12:28:20.955757 4817 scope.go:117] "RemoveContainer" 
containerID="71413a1b5d1dddab7dc4539e905bef27a38f4c8abf75d13dbe547cf8a895de8f" Mar 20 12:28:20 crc kubenswrapper[4817]: I0320 12:28:20.955958 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:28:20 crc kubenswrapper[4817]: I0320 12:28:20.957280 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:20 crc kubenswrapper[4817]: I0320 12:28:20.957330 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:20 crc kubenswrapper[4817]: I0320 12:28:20.957347 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:20 crc kubenswrapper[4817]: I0320 12:28:20.957990 4817 scope.go:117] "RemoveContainer" containerID="3e572dcd05d0127d887dd34d43b57ba70ca318e81366eeeed89df28e8cd63823" Mar 20 12:28:20 crc kubenswrapper[4817]: E0320 12:28:20.958433 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 12:28:21 crc kubenswrapper[4817]: I0320 12:28:21.597549 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:21 crc kubenswrapper[4817]: I0320 12:28:21.698854 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:28:21 crc kubenswrapper[4817]: E0320 12:28:21.699360 4817 controller.go:145] "Failed to ensure lease exists, will 
retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 12:28:21 crc kubenswrapper[4817]: I0320 12:28:21.701907 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:21 crc kubenswrapper[4817]: I0320 12:28:21.702016 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:21 crc kubenswrapper[4817]: I0320 12:28:21.702037 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:21 crc kubenswrapper[4817]: I0320 12:28:21.702070 4817 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 12:28:21 crc kubenswrapper[4817]: E0320 12:28:21.709144 4817 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 12:28:21 crc kubenswrapper[4817]: I0320 12:28:21.961551 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 12:28:22 crc kubenswrapper[4817]: I0320 12:28:22.594933 4817 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 12:28:23 crc kubenswrapper[4817]: I0320 12:28:23.234064 4817 csr.go:261] certificate signing request csr-2tkbn is approved, waiting to be issued Mar 20 12:28:23 crc kubenswrapper[4817]: I0320 12:28:23.241084 4817 csr.go:257] certificate signing request csr-2tkbn is issued Mar 20 12:28:23 crc kubenswrapper[4817]: 
I0320 12:28:23.344340 4817 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 12:28:23 crc kubenswrapper[4817]: I0320 12:28:23.413624 4817 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 12:28:23 crc kubenswrapper[4817]: I0320 12:28:23.687208 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:28:23 crc kubenswrapper[4817]: I0320 12:28:23.687431 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:28:23 crc kubenswrapper[4817]: I0320 12:28:23.688958 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:23 crc kubenswrapper[4817]: I0320 12:28:23.688998 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:23 crc kubenswrapper[4817]: I0320 12:28:23.689006 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:23 crc kubenswrapper[4817]: I0320 12:28:23.689620 4817 scope.go:117] "RemoveContainer" containerID="3e572dcd05d0127d887dd34d43b57ba70ca318e81366eeeed89df28e8cd63823" Mar 20 12:28:23 crc kubenswrapper[4817]: E0320 12:28:23.689781 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 12:28:24 crc kubenswrapper[4817]: I0320 12:28:24.242542 4817 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation 
deadline is 2026-11-14 01:24:14.517352363 +0000 UTC Mar 20 12:28:24 crc kubenswrapper[4817]: I0320 12:28:24.242606 4817 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 5724h55m50.274751461s for next certificate rotation Mar 20 12:28:27 crc kubenswrapper[4817]: I0320 12:28:27.760373 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:28:27 crc kubenswrapper[4817]: I0320 12:28:27.760565 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:28:27 crc kubenswrapper[4817]: I0320 12:28:27.761968 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:27 crc kubenswrapper[4817]: I0320 12:28:27.762052 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:27 crc kubenswrapper[4817]: I0320 12:28:27.762092 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:27 crc kubenswrapper[4817]: I0320 12:28:27.763332 4817 scope.go:117] "RemoveContainer" containerID="3e572dcd05d0127d887dd34d43b57ba70ca318e81366eeeed89df28e8cd63823" Mar 20 12:28:27 crc kubenswrapper[4817]: E0320 12:28:27.763635 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.709947 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.711414 
4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.711456 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.711467 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.711582 4817 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.722112 4817 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.722509 4817 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 12:28:28 crc kubenswrapper[4817]: E0320 12:28:28.722548 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.727364 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.727433 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.727454 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.727481 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.727497 4817 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:28Z","lastTransitionTime":"2026-03-20T12:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:28 crc kubenswrapper[4817]: E0320 12:28:28.739794 4817 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 12:28:28 crc kubenswrapper[4817]: E0320 12:28:28.749434 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1ed2f349-5c70-4bd5-a6f5-330128fd6277\\\",\\\"systemUUID\\\":\\\"8b6acc6f-72ef-432d-a377-611bb5e5be3b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.760045 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.760158 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.760182 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.760212 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.760235 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:28Z","lastTransitionTime":"2026-03-20T12:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:28 crc kubenswrapper[4817]: E0320 12:28:28.773880 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1ed2f349-5c70-4bd5-a6f5-330128fd6277\\\",\\\"systemUUID\\\":\\\"8b6acc6f-72ef-432d-a377-611bb5e5be3b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.783004 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.783046 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.783056 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.783074 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.783085 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:28Z","lastTransitionTime":"2026-03-20T12:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:28 crc kubenswrapper[4817]: E0320 12:28:28.793884 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1ed2f349-5c70-4bd5-a6f5-330128fd6277\\\",\\\"systemUUID\\\":\\\"8b6acc6f-72ef-432d-a377-611bb5e5be3b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.801154 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.801207 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.801221 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.801240 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:28 crc kubenswrapper[4817]: I0320 12:28:28.801253 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:28Z","lastTransitionTime":"2026-03-20T12:28:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:28 crc kubenswrapper[4817]: E0320 12:28:28.811621 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1ed2f349-5c70-4bd5-a6f5-330128fd6277\\\",\\\"systemUUID\\\":\\\"8b6acc6f-72ef-432d-a377-611bb5e5be3b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:28 crc kubenswrapper[4817]: E0320 12:28:28.811781 4817 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 12:28:28 crc kubenswrapper[4817]: E0320 12:28:28.811815 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:28 crc kubenswrapper[4817]: E0320 12:28:28.912947 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:29 crc kubenswrapper[4817]: E0320 12:28:29.013283 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:29 crc kubenswrapper[4817]: E0320 12:28:29.113657 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:29 crc kubenswrapper[4817]: E0320 12:28:29.213799 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:29 crc kubenswrapper[4817]: E0320 12:28:29.314168 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:29 crc kubenswrapper[4817]: E0320 12:28:29.414768 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:29 crc kubenswrapper[4817]: E0320 12:28:29.515571 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:29 crc kubenswrapper[4817]: E0320 12:28:29.615743 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:29 crc kubenswrapper[4817]: E0320 12:28:29.716206 4817 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:29 crc kubenswrapper[4817]: E0320 12:28:29.817230 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:29 crc kubenswrapper[4817]: E0320 12:28:29.918044 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:30 crc kubenswrapper[4817]: E0320 12:28:30.019140 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:30 crc kubenswrapper[4817]: E0320 12:28:30.120261 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:30 crc kubenswrapper[4817]: E0320 12:28:30.379745 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:30 crc kubenswrapper[4817]: E0320 12:28:30.480038 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:30 crc kubenswrapper[4817]: E0320 12:28:30.580276 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:30 crc kubenswrapper[4817]: I0320 12:28:30.642919 4817 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 12:28:30 crc kubenswrapper[4817]: E0320 12:28:30.680763 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:30 crc kubenswrapper[4817]: E0320 12:28:30.781232 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:30 crc kubenswrapper[4817]: E0320 12:28:30.882343 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:30 crc 
kubenswrapper[4817]: E0320 12:28:30.982466 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:31 crc kubenswrapper[4817]: E0320 12:28:31.083313 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:31 crc kubenswrapper[4817]: E0320 12:28:31.184070 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:31 crc kubenswrapper[4817]: E0320 12:28:31.284466 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:31 crc kubenswrapper[4817]: E0320 12:28:31.385078 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:31 crc kubenswrapper[4817]: E0320 12:28:31.485490 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:31 crc kubenswrapper[4817]: E0320 12:28:31.586597 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:31 crc kubenswrapper[4817]: E0320 12:28:31.687225 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:31 crc kubenswrapper[4817]: E0320 12:28:31.787962 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:31 crc kubenswrapper[4817]: E0320 12:28:31.889109 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:31 crc kubenswrapper[4817]: E0320 12:28:31.990025 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:32 crc kubenswrapper[4817]: I0320 12:28:32.056347 4817 reflector.go:368] Caches populated for *v1.RuntimeClass from 
k8s.io/client-go/informers/factory.go:160 Mar 20 12:28:32 crc kubenswrapper[4817]: E0320 12:28:32.090716 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:32 crc kubenswrapper[4817]: E0320 12:28:32.190869 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:32 crc kubenswrapper[4817]: E0320 12:28:32.291382 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:32 crc kubenswrapper[4817]: E0320 12:28:32.392156 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:32 crc kubenswrapper[4817]: E0320 12:28:32.492359 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:32 crc kubenswrapper[4817]: E0320 12:28:32.592701 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:32 crc kubenswrapper[4817]: E0320 12:28:32.692871 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:32 crc kubenswrapper[4817]: E0320 12:28:32.793487 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:32 crc kubenswrapper[4817]: E0320 12:28:32.893804 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:32 crc kubenswrapper[4817]: E0320 12:28:32.994346 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:33 crc kubenswrapper[4817]: E0320 12:28:33.095018 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:33 crc kubenswrapper[4817]: E0320 12:28:33.195887 4817 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:33 crc kubenswrapper[4817]: E0320 12:28:33.296670 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:33 crc kubenswrapper[4817]: E0320 12:28:33.396973 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:33 crc kubenswrapper[4817]: E0320 12:28:33.497837 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:33 crc kubenswrapper[4817]: E0320 12:28:33.599041 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:33 crc kubenswrapper[4817]: E0320 12:28:33.700082 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:33 crc kubenswrapper[4817]: E0320 12:28:33.800595 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:33 crc kubenswrapper[4817]: E0320 12:28:33.901732 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:34 crc kubenswrapper[4817]: E0320 12:28:34.002098 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:34 crc kubenswrapper[4817]: E0320 12:28:34.102783 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:34 crc kubenswrapper[4817]: E0320 12:28:34.203319 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:34 crc kubenswrapper[4817]: E0320 12:28:34.303683 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:34 crc 
kubenswrapper[4817]: E0320 12:28:34.404760 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:34 crc kubenswrapper[4817]: E0320 12:28:34.504906 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:34 crc kubenswrapper[4817]: E0320 12:28:34.605806 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:34 crc kubenswrapper[4817]: E0320 12:28:34.705945 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:34 crc kubenswrapper[4817]: E0320 12:28:34.806509 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:34 crc kubenswrapper[4817]: E0320 12:28:34.907055 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:35 crc kubenswrapper[4817]: E0320 12:28:35.007947 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:35 crc kubenswrapper[4817]: E0320 12:28:35.108731 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:35 crc kubenswrapper[4817]: E0320 12:28:35.210038 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:35 crc kubenswrapper[4817]: E0320 12:28:35.310938 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:35 crc kubenswrapper[4817]: E0320 12:28:35.411948 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:35 crc kubenswrapper[4817]: E0320 12:28:35.512589 4817 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 12:28:35 crc kubenswrapper[4817]: E0320 12:28:35.612954 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:35 crc kubenswrapper[4817]: E0320 12:28:35.713186 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:35 crc kubenswrapper[4817]: E0320 12:28:35.814226 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:35 crc kubenswrapper[4817]: E0320 12:28:35.914737 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:36 crc kubenswrapper[4817]: E0320 12:28:36.015788 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:36 crc kubenswrapper[4817]: E0320 12:28:36.116402 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:36 crc kubenswrapper[4817]: E0320 12:28:36.217028 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:36 crc kubenswrapper[4817]: E0320 12:28:36.318185 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:36 crc kubenswrapper[4817]: E0320 12:28:36.419263 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:36 crc kubenswrapper[4817]: E0320 12:28:36.519844 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:36 crc kubenswrapper[4817]: E0320 12:28:36.620692 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:36 crc kubenswrapper[4817]: E0320 12:28:36.720824 4817 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:36 crc kubenswrapper[4817]: E0320 12:28:36.821797 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:36 crc kubenswrapper[4817]: E0320 12:28:36.922357 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:37 crc kubenswrapper[4817]: E0320 12:28:37.022869 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:37 crc kubenswrapper[4817]: E0320 12:28:37.123444 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:37 crc kubenswrapper[4817]: E0320 12:28:37.224558 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:37 crc kubenswrapper[4817]: E0320 12:28:37.325823 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:37 crc kubenswrapper[4817]: E0320 12:28:37.426687 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:37 crc kubenswrapper[4817]: E0320 12:28:37.527239 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:37 crc kubenswrapper[4817]: E0320 12:28:37.628326 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:37 crc kubenswrapper[4817]: E0320 12:28:37.729046 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:37 crc kubenswrapper[4817]: E0320 12:28:37.829588 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:37 crc kubenswrapper[4817]: E0320 
12:28:37.930216 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:38 crc kubenswrapper[4817]: E0320 12:28:38.030966 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:38 crc kubenswrapper[4817]: E0320 12:28:38.132156 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:38 crc kubenswrapper[4817]: E0320 12:28:38.233699 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:38 crc kubenswrapper[4817]: E0320 12:28:38.334650 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:38 crc kubenswrapper[4817]: E0320 12:28:38.435238 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:38 crc kubenswrapper[4817]: E0320 12:28:38.535891 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:38 crc kubenswrapper[4817]: E0320 12:28:38.636384 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:38 crc kubenswrapper[4817]: E0320 12:28:38.736532 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:38 crc kubenswrapper[4817]: E0320 12:28:38.740894 4817 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 12:28:38 crc kubenswrapper[4817]: E0320 12:28:38.837626 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:38 crc kubenswrapper[4817]: E0320 12:28:38.938410 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 20 12:28:38 crc kubenswrapper[4817]: E0320 12:28:38.965855 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 12:28:38 crc kubenswrapper[4817]: I0320 12:28:38.970830 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:38 crc kubenswrapper[4817]: I0320 12:28:38.971243 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:38 crc kubenswrapper[4817]: I0320 12:28:38.971417 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:38 crc kubenswrapper[4817]: I0320 12:28:38.971572 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:38 crc kubenswrapper[4817]: I0320 12:28:38.971702 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:38Z","lastTransitionTime":"2026-03-20T12:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:38 crc kubenswrapper[4817]: E0320 12:28:38.987586 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1ed2f349-5c70-4bd5-a6f5-330128fd6277\\\",\\\"systemUUID\\\":\\\"8b6acc6f-72ef-432d-a377-611bb5e5be3b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:38 crc kubenswrapper[4817]: I0320 12:28:38.991609 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:38 crc kubenswrapper[4817]: I0320 12:28:38.991645 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:38 crc kubenswrapper[4817]: I0320 12:28:38.991657 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:38 crc kubenswrapper[4817]: I0320 12:28:38.991675 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:38 crc kubenswrapper[4817]: I0320 12:28:38.991687 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:38Z","lastTransitionTime":"2026-03-20T12:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:39 crc kubenswrapper[4817]: E0320 12:28:39.005989 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1ed2f349-5c70-4bd5-a6f5-330128fd6277\\\",\\\"systemUUID\\\":\\\"8b6acc6f-72ef-432d-a377-611bb5e5be3b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:39 crc kubenswrapper[4817]: I0320 12:28:39.010196 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:39 crc kubenswrapper[4817]: I0320 12:28:39.010263 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:39 crc kubenswrapper[4817]: I0320 12:28:39.010278 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:39 crc kubenswrapper[4817]: I0320 12:28:39.010292 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:39 crc kubenswrapper[4817]: I0320 12:28:39.010304 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:39Z","lastTransitionTime":"2026-03-20T12:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:39 crc kubenswrapper[4817]: E0320 12:28:39.019669 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1ed2f349-5c70-4bd5-a6f5-330128fd6277\\\",\\\"systemUUID\\\":\\\"8b6acc6f-72ef-432d-a377-611bb5e5be3b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:39 crc kubenswrapper[4817]: I0320 12:28:39.023355 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:39 crc kubenswrapper[4817]: I0320 12:28:39.023396 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:39 crc kubenswrapper[4817]: I0320 12:28:39.023413 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:39 crc kubenswrapper[4817]: I0320 12:28:39.023429 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:39 crc kubenswrapper[4817]: I0320 12:28:39.023440 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:39Z","lastTransitionTime":"2026-03-20T12:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:39 crc kubenswrapper[4817]: E0320 12:28:39.041572 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1ed2f349-5c70-4bd5-a6f5-330128fd6277\\\",\\\"systemUUID\\\":\\\"8b6acc6f-72ef-432d-a377-611bb5e5be3b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:39 crc kubenswrapper[4817]: E0320 12:28:39.041932 4817 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 12:28:39 crc kubenswrapper[4817]: E0320 12:28:39.042030 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:39 crc kubenswrapper[4817]: E0320 12:28:39.142540 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:39 crc kubenswrapper[4817]: E0320 12:28:39.243035 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:39 crc kubenswrapper[4817]: E0320 12:28:39.344187 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:39 crc kubenswrapper[4817]: E0320 12:28:39.445338 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:39 crc kubenswrapper[4817]: E0320 12:28:39.546410 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:39 crc kubenswrapper[4817]: E0320 12:28:39.647545 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:39 crc kubenswrapper[4817]: E0320 12:28:39.748659 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:39 crc kubenswrapper[4817]: E0320 12:28:39.849801 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:39 crc kubenswrapper[4817]: E0320 12:28:39.949943 4817 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:40 crc kubenswrapper[4817]: E0320 12:28:40.050180 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:40 crc kubenswrapper[4817]: E0320 12:28:40.151243 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:40 crc kubenswrapper[4817]: E0320 12:28:40.251751 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:40 crc kubenswrapper[4817]: E0320 12:28:40.352339 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:40 crc kubenswrapper[4817]: E0320 12:28:40.452723 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:40 crc kubenswrapper[4817]: E0320 12:28:40.552946 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:40 crc kubenswrapper[4817]: E0320 12:28:40.654075 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:40 crc kubenswrapper[4817]: E0320 12:28:40.754682 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:40 crc kubenswrapper[4817]: E0320 12:28:40.855210 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:40 crc kubenswrapper[4817]: E0320 12:28:40.955793 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:41 crc kubenswrapper[4817]: E0320 12:28:41.056735 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:41 crc 
kubenswrapper[4817]: E0320 12:28:41.157071 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:41 crc kubenswrapper[4817]: E0320 12:28:41.258191 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:41 crc kubenswrapper[4817]: E0320 12:28:41.358362 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:41 crc kubenswrapper[4817]: E0320 12:28:41.458469 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:41 crc kubenswrapper[4817]: E0320 12:28:41.559008 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:41 crc kubenswrapper[4817]: E0320 12:28:41.659930 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:41 crc kubenswrapper[4817]: I0320 12:28:41.663499 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:28:41 crc kubenswrapper[4817]: I0320 12:28:41.665233 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:41 crc kubenswrapper[4817]: I0320 12:28:41.665296 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:41 crc kubenswrapper[4817]: I0320 12:28:41.665320 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:41 crc kubenswrapper[4817]: I0320 12:28:41.666374 4817 scope.go:117] "RemoveContainer" containerID="3e572dcd05d0127d887dd34d43b57ba70ca318e81366eeeed89df28e8cd63823" Mar 20 12:28:41 crc kubenswrapper[4817]: E0320 12:28:41.666707 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 12:28:41 crc kubenswrapper[4817]: E0320 12:28:41.760859 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:41 crc kubenswrapper[4817]: E0320 12:28:41.861493 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:41 crc kubenswrapper[4817]: E0320 12:28:41.961647 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:42 crc kubenswrapper[4817]: E0320 12:28:42.062180 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:42 crc kubenswrapper[4817]: E0320 12:28:42.162482 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:42 crc kubenswrapper[4817]: E0320 12:28:42.262680 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:42 crc kubenswrapper[4817]: E0320 12:28:42.363222 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:42 crc kubenswrapper[4817]: E0320 12:28:42.464030 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:42 crc kubenswrapper[4817]: E0320 12:28:42.565001 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:42 crc kubenswrapper[4817]: E0320 12:28:42.665419 4817 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 20 12:28:42 crc kubenswrapper[4817]: E0320 12:28:42.765571 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:42 crc kubenswrapper[4817]: E0320 12:28:42.865894 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:42 crc kubenswrapper[4817]: E0320 12:28:42.966226 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:43 crc kubenswrapper[4817]: E0320 12:28:43.066320 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:43 crc kubenswrapper[4817]: E0320 12:28:43.167152 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:43 crc kubenswrapper[4817]: E0320 12:28:43.267556 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:43 crc kubenswrapper[4817]: E0320 12:28:43.368633 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:43 crc kubenswrapper[4817]: E0320 12:28:43.469554 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:43 crc kubenswrapper[4817]: E0320 12:28:43.570035 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:43 crc kubenswrapper[4817]: E0320 12:28:43.671180 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:43 crc kubenswrapper[4817]: E0320 12:28:43.771326 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:43 crc kubenswrapper[4817]: E0320 12:28:43.871467 4817 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:43 crc kubenswrapper[4817]: E0320 12:28:43.972650 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:44 crc kubenswrapper[4817]: E0320 12:28:44.073084 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:44 crc kubenswrapper[4817]: E0320 12:28:44.174071 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:44 crc kubenswrapper[4817]: E0320 12:28:44.275189 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:44 crc kubenswrapper[4817]: E0320 12:28:44.376068 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:44 crc kubenswrapper[4817]: E0320 12:28:44.476403 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:44 crc kubenswrapper[4817]: E0320 12:28:44.577086 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:44 crc kubenswrapper[4817]: I0320 12:28:44.662838 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 12:28:44 crc kubenswrapper[4817]: I0320 12:28:44.663781 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:44 crc kubenswrapper[4817]: I0320 12:28:44.663807 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:44 crc kubenswrapper[4817]: I0320 12:28:44.663815 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:44 crc 
kubenswrapper[4817]: E0320 12:28:44.678181 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:44 crc kubenswrapper[4817]: E0320 12:28:44.779324 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:44 crc kubenswrapper[4817]: E0320 12:28:44.880608 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:44 crc kubenswrapper[4817]: E0320 12:28:44.981614 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:45 crc kubenswrapper[4817]: E0320 12:28:45.081945 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:45 crc kubenswrapper[4817]: E0320 12:28:45.182588 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:45 crc kubenswrapper[4817]: E0320 12:28:45.283673 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:45 crc kubenswrapper[4817]: E0320 12:28:45.384269 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:45 crc kubenswrapper[4817]: E0320 12:28:45.485171 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:45 crc kubenswrapper[4817]: E0320 12:28:45.586150 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:45 crc kubenswrapper[4817]: E0320 12:28:45.687213 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:45 crc kubenswrapper[4817]: E0320 12:28:45.788348 4817 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 12:28:45 crc kubenswrapper[4817]: E0320 12:28:45.888657 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:45 crc kubenswrapper[4817]: E0320 12:28:45.989437 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:46 crc kubenswrapper[4817]: E0320 12:28:46.090304 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:46 crc kubenswrapper[4817]: E0320 12:28:46.191503 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:46 crc kubenswrapper[4817]: E0320 12:28:46.292697 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:46 crc kubenswrapper[4817]: E0320 12:28:46.393142 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:46 crc kubenswrapper[4817]: E0320 12:28:46.493996 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:46 crc kubenswrapper[4817]: E0320 12:28:46.595193 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:46 crc kubenswrapper[4817]: E0320 12:28:46.696305 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:46 crc kubenswrapper[4817]: E0320 12:28:46.796557 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:46 crc kubenswrapper[4817]: E0320 12:28:46.897350 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:46 crc kubenswrapper[4817]: E0320 12:28:46.997652 4817 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:47 crc kubenswrapper[4817]: E0320 12:28:47.098501 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:47 crc kubenswrapper[4817]: E0320 12:28:47.199428 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:47 crc kubenswrapper[4817]: E0320 12:28:47.300259 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:47 crc kubenswrapper[4817]: E0320 12:28:47.400707 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:47 crc kubenswrapper[4817]: E0320 12:28:47.501149 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:47 crc kubenswrapper[4817]: E0320 12:28:47.601582 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:47 crc kubenswrapper[4817]: E0320 12:28:47.702200 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:47 crc kubenswrapper[4817]: E0320 12:28:47.802536 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:47 crc kubenswrapper[4817]: E0320 12:28:47.903970 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:48 crc kubenswrapper[4817]: E0320 12:28:48.004069 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:48 crc kubenswrapper[4817]: E0320 12:28:48.104331 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:48 crc kubenswrapper[4817]: E0320 
12:28:48.205353 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:48 crc kubenswrapper[4817]: E0320 12:28:48.306269 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:48 crc kubenswrapper[4817]: E0320 12:28:48.407315 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:48 crc kubenswrapper[4817]: E0320 12:28:48.507869 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:48 crc kubenswrapper[4817]: E0320 12:28:48.608287 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:48 crc kubenswrapper[4817]: E0320 12:28:48.709028 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:48 crc kubenswrapper[4817]: E0320 12:28:48.741897 4817 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 12:28:48 crc kubenswrapper[4817]: E0320 12:28:48.809216 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:48 crc kubenswrapper[4817]: E0320 12:28:48.909582 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:49 crc kubenswrapper[4817]: E0320 12:28:49.010516 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:49 crc kubenswrapper[4817]: E0320 12:28:49.111001 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:49 crc kubenswrapper[4817]: E0320 12:28:49.137236 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="error 
getting node \"crc\": node \"crc\" not found" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.141329 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.141366 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.141376 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.141392 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.141403 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:49Z","lastTransitionTime":"2026-03-20T12:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:49 crc kubenswrapper[4817]: E0320 12:28:49.153835 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1ed2f349-5c70-4bd5-a6f5-330128fd6277\\\",\\\"systemUUID\\\":\\\"8b6acc6f-72ef-432d-a377-611bb5e5be3b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.159232 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.159267 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.159279 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.159295 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.159307 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:49Z","lastTransitionTime":"2026-03-20T12:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:49 crc kubenswrapper[4817]: E0320 12:28:49.174395 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1ed2f349-5c70-4bd5-a6f5-330128fd6277\\\",\\\"systemUUID\\\":\\\"8b6acc6f-72ef-432d-a377-611bb5e5be3b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.179019 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.179064 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.179076 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.179097 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.179111 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:49Z","lastTransitionTime":"2026-03-20T12:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:49 crc kubenswrapper[4817]: E0320 12:28:49.192928 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1ed2f349-5c70-4bd5-a6f5-330128fd6277\\\",\\\"systemUUID\\\":\\\"8b6acc6f-72ef-432d-a377-611bb5e5be3b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.200050 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.200103 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.200137 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.200156 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:49 crc kubenswrapper[4817]: I0320 12:28:49.200167 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:49Z","lastTransitionTime":"2026-03-20T12:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:49 crc kubenswrapper[4817]: E0320 12:28:49.210568 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1ed2f349-5c70-4bd5-a6f5-330128fd6277\\\",\\\"systemUUID\\\":\\\"8b6acc6f-72ef-432d-a377-611bb5e5be3b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:49 crc kubenswrapper[4817]: E0320 12:28:49.210738 4817 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 12:28:49 crc kubenswrapper[4817]: E0320 12:28:49.211489 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:49 crc kubenswrapper[4817]: E0320 12:28:49.311964 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:49 crc kubenswrapper[4817]: E0320 12:28:49.412978 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:49 crc kubenswrapper[4817]: E0320 12:28:49.513820 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:49 crc kubenswrapper[4817]: E0320 12:28:49.614242 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:49 crc kubenswrapper[4817]: E0320 12:28:49.715081 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:49 crc kubenswrapper[4817]: E0320 12:28:49.815607 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:49 crc kubenswrapper[4817]: E0320 12:28:49.916384 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:50 crc kubenswrapper[4817]: E0320 12:28:50.017578 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:50 crc kubenswrapper[4817]: E0320 12:28:50.118504 4817 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:50 crc kubenswrapper[4817]: E0320 12:28:50.219080 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:50 crc kubenswrapper[4817]: E0320 12:28:50.319342 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:50 crc kubenswrapper[4817]: E0320 12:28:50.420572 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:50 crc kubenswrapper[4817]: E0320 12:28:50.521221 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:50 crc kubenswrapper[4817]: I0320 12:28:50.558113 4817 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 12:28:50 crc kubenswrapper[4817]: E0320 12:28:50.622090 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:50 crc kubenswrapper[4817]: E0320 12:28:50.722301 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:50 crc kubenswrapper[4817]: E0320 12:28:50.823603 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:50 crc kubenswrapper[4817]: E0320 12:28:50.924690 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:51 crc kubenswrapper[4817]: E0320 12:28:51.025468 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:51 crc kubenswrapper[4817]: E0320 12:28:51.126290 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:51 crc 
kubenswrapper[4817]: E0320 12:28:51.226722 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:51 crc kubenswrapper[4817]: E0320 12:28:51.327818 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:51 crc kubenswrapper[4817]: E0320 12:28:51.427972 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:51 crc kubenswrapper[4817]: E0320 12:28:51.529176 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:51 crc kubenswrapper[4817]: E0320 12:28:51.629727 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:51 crc kubenswrapper[4817]: E0320 12:28:51.730150 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:51 crc kubenswrapper[4817]: E0320 12:28:51.830760 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:51 crc kubenswrapper[4817]: E0320 12:28:51.931288 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:52 crc kubenswrapper[4817]: E0320 12:28:52.031964 4817 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.057803 4817 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.134483 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.134529 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.134552 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.134574 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.134587 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:52Z","lastTransitionTime":"2026-03-20T12:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.237544 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.237610 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.237634 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.237663 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.237684 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:52Z","lastTransitionTime":"2026-03-20T12:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.341292 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.341357 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.341374 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.341401 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.341419 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:52Z","lastTransitionTime":"2026-03-20T12:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.444849 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.444924 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.444941 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.444967 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.444986 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:52Z","lastTransitionTime":"2026-03-20T12:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.547445 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.547497 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.547510 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.547527 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.547542 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:52Z","lastTransitionTime":"2026-03-20T12:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.616211 4817 apiserver.go:52] "Watching apiserver" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.621634 4817 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.621921 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.622425 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.622558 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.622644 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.622774 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 12:28:52 crc kubenswrapper[4817]: E0320 12:28:52.622869 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 12:28:52 crc kubenswrapper[4817]: E0320 12:28:52.622893 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.623350 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:28:52 crc kubenswrapper[4817]: E0320 12:28:52.623454 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.623366 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.625190 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.625270 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.625368 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.625388 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.625832 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.626211 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.626592 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.626850 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.627024 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.649892 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.650157 4817 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.650193 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.650463 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.650489 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:52Z","lastTransitionTime":"2026-03-20T12:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.651995 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.663237 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.674399 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.687250 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.691668 4817 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.699947 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710008 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710055 
4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710079 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710100 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710143 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710167 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710193 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710213 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710233 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710254 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710280 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710302 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 
12:28:52.710326 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710346 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710368 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710404 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710427 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710450 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710471 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710494 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710516 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710538 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710561 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710582 4817 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710603 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710624 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710658 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710682 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710703 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710726 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710748 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710773 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710795 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710816 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710840 4817 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710862 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710884 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710904 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710927 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710947 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710969 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.710992 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711014 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711034 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711057 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 
12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711078 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711100 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711142 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711163 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711185 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711210 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711212 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711212 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711232 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711317 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711338 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711354 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711374 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711395 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711412 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711426 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711441 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711455 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711471 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 12:28:52 
crc kubenswrapper[4817]: I0320 12:28:52.711487 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711503 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711520 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711539 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711556 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711574 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711590 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711608 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712002 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712020 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712036 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712051 
4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712233 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712252 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712269 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712285 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712303 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" 
(UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712319 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712339 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712359 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712377 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712395 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712410 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712427 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712446 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712462 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712478 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712495 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712511 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712527 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712544 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712561 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712577 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711313 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712593 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712612 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712628 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712645 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712662 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712679 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712697 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712715 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712732 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712751 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712768 4817 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712783 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712845 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712864 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712995 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.713016 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.713036 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.713057 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714050 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714071 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714088 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714105 4817 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714136 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714152 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714167 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714184 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714200 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714216 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714232 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714248 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714264 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714280 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714296 4817 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714316 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714333 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714349 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714365 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714381 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod 
\"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714402 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714417 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714434 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714491 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714507 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714523 4817 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714543 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714559 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714577 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714593 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714610 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714627 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714643 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714663 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714679 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714695 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714711 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714727 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714744 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714759 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714776 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714861 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:28:52 crc 
kubenswrapper[4817]: I0320 12:28:52.715135 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715154 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715171 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715188 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715205 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715222 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715238 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715254 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715270 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715289 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715306 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 
12:28:52.715323 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715340 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715360 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715378 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715395 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715411 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715428 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715445 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715462 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715478 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715498 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 
12:28:52.715514 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715530 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715546 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715562 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715578 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715602 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715618 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715636 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715702 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715753 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715771 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715788 4817 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715805 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715822 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715839 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715858 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715877 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715893 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715931 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715953 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715973 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715993 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716011 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716032 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716053 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716074 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716097 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716130 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716149 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716169 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716187 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716208 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716264 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716275 4817 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716286 4817 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716784 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712589 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711431 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711760 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.711795 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712147 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712166 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712183 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712492 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712462 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712549 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712565 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712652 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712678 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712781 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.712904 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.713148 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.713281 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.713415 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.713463 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.713619 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.713977 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714068 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714185 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714272 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714287 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.714314 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715168 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715285 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715253 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715348 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715467 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715590 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.715983 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716189 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716275 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716448 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716482 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.716883 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.717251 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.717375 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.717460 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.717695 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.717678 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.717774 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.718088 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.718220 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.718344 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.718521 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.718622 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.718707 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.718752 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.718780 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.719152 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.719447 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.719596 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.719790 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.720285 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.720363 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.720463 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.720697 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.720624 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.720796 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.720881 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.720979 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.721155 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.721181 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.721193 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.721493 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.721512 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.721976 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: E0320 12:28:52.722597 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:28:53.22228089 +0000 UTC m=+95.310593673 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.722789 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.722888 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.723451 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.723497 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.723786 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.724537 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.724576 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.725007 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.725466 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.725508 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.725856 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.725878 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.726081 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.726190 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.726201 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.726460 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.726523 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.726531 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.726886 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.727047 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.727290 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.727466 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.727628 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.729010 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.727746 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.728108 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.728304 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.727736 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.729161 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.728468 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.728924 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.728971 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.729203 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.729249 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.729481 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.729409 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.729990 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.730182 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: E0320 12:28:52.730228 4817 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.730517 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.730555 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.730894 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: E0320 12:28:52.730960 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 12:28:53.230737556 +0000 UTC m=+95.319050359 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.733554 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.733670 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.733865 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.733955 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.734104 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.734364 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.734706 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.735157 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.735663 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.735693 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.735780 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.735903 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.736021 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.736206 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.736654 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.736900 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.730441 4817 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.737941 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.738038 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.738410 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). 
InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.738289 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.738585 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.731456 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: E0320 12:28:52.731605 4817 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.739041 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.731776 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.731962 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.732031 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.739965 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.739468 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.739985 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.731174 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.732634 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.732752 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.733033 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.733360 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.738300 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.739782 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.732590 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.740399 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.740657 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: E0320 12:28:52.740833 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 12:28:53.240806766 +0000 UTC m=+95.329119609 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.742303 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 12:28:52 crc kubenswrapper[4817]: E0320 12:28:52.742524 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 12:28:52 crc kubenswrapper[4817]: E0320 12:28:52.742558 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 12:28:52 crc kubenswrapper[4817]: E0320 12:28:52.742574 4817 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:28:52 crc kubenswrapper[4817]: E0320 12:28:52.742651 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 12:28:53.242635357 +0000 UTC m=+95.330948230 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.743149 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.744621 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.744813 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: E0320 12:28:52.745934 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 12:28:52 crc kubenswrapper[4817]: E0320 12:28:52.745955 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 12:28:52 crc kubenswrapper[4817]: E0320 12:28:52.745968 4817 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:28:52 crc kubenswrapper[4817]: E0320 12:28:52.746012 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 12:28:53.245996461 +0000 UTC m=+95.334309244 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.746203 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.746906 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.747148 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.747648 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.747875 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.747968 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.748064 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.748386 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.748780 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.748972 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.750335 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.750541 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.751504 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.751980 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.752397 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.752751 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.753168 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.753397 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.753414 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.753423 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.753435 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.753476 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:52Z","lastTransitionTime":"2026-03-20T12:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.753823 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.754221 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.754625 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.755843 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.756605 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.756685 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.756749 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.757863 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.758076 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.758489 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.759242 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.759504 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.759547 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.759673 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.759805 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.760034 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.760027 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.760303 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.760408 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.760451 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.760628 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.760738 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.761019 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.761026 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.761111 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.773546 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.776556 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.784592 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.787920 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.816821 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.816869 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.816924 4817 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.816956 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 12:28:52 crc 
kubenswrapper[4817]: I0320 12:28:52.816993 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817035 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817047 4817 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817057 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817069 4817 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817079 4817 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817088 4817 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817096 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817105 4817 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817115 4817 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817145 4817 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817156 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817165 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817165 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817173 4817 reconciler_common.go:293] "Volume detached for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817183 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817194 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817203 4817 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817212 4817 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817222 4817 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817231 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817240 4817 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817249 4817 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817258 4817 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817266 4817 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817275 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817285 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817293 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817301 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc 
kubenswrapper[4817]: I0320 12:28:52.817310 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817318 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817327 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817337 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817345 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817353 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817364 4817 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: 
I0320 12:28:52.817372 4817 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817380 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817388 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817429 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817438 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817446 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817531 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817542 4817 reconciler_common.go:293] "Volume detached for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817551 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817559 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817568 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817596 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817605 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817612 4817 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817620 4817 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") 
on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817628 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817636 4817 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817644 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817652 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817660 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817668 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817678 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" 
Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817686 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817694 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817703 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817711 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817721 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817728 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817737 4817 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817745 4817 
reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817753 4817 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817761 4817 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817769 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817777 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817785 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817793 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817802 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817810 4817 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817818 4817 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817827 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817836 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817844 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817852 4817 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817859 4817 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath 
\"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817867 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817875 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817882 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817890 4817 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817897 4817 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817905 4817 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817913 4817 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817921 4817 reconciler_common.go:293] "Volume 
detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817929 4817 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817940 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817959 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817967 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817975 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817982 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817990 4817 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on 
node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.817998 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818005 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818013 4817 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818021 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818028 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818036 4817 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818045 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818060 
4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818068 4817 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818076 4817 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818084 4817 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818092 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818100 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818108 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818115 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818136 4817 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818143 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818150 4817 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818158 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818165 4817 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818174 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818182 4817 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") 
on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818191 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818199 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818207 4817 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818214 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818222 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818229 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818237 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818244 4817 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818252 4817 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818260 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818267 4817 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818274 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818281 4817 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818289 4817 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818296 4817 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818303 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818311 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818319 4817 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818327 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818335 4817 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818342 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818350 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818357 4817 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818365 4817 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818373 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818381 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818388 4817 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818396 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818404 4817 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818411 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818418 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818426 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818434 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818442 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818450 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818457 4817 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc 
kubenswrapper[4817]: I0320 12:28:52.818465 4817 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818472 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818480 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818487 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818495 4817 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818502 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818510 4817 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818518 4817 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818526 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818534 4817 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818543 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818551 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818560 4817 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818567 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818575 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818582 4817 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818589 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818598 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818605 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818613 4817 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818620 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818627 4817 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on 
node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818634 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818642 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818650 4817 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818657 4817 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818664 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818672 4817 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818679 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 
12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818687 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818695 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818702 4817 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818709 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818717 4817 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818724 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818731 4817 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818739 4817 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.818746 4817 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.855485 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.855539 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.855555 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.855577 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.855596 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:52Z","lastTransitionTime":"2026-03-20T12:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.945674 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.958801 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.958960 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.959046 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.959182 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.959289 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:52Z","lastTransitionTime":"2026-03-20T12:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.963336 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 12:28:52 crc kubenswrapper[4817]: I0320 12:28:52.980376 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 12:28:53 crc kubenswrapper[4817]: W0320 12:28:53.001250 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-62ca807262f75dcaf46db5ecb4469b4f08387d98df8ad89eea84e1fbf6336f9c WatchSource:0}: Error finding container 62ca807262f75dcaf46db5ecb4469b4f08387d98df8ad89eea84e1fbf6336f9c: Status 404 returned error can't find the container with id 62ca807262f75dcaf46db5ecb4469b4f08387d98df8ad89eea84e1fbf6336f9c Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.043752 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"62ca807262f75dcaf46db5ecb4469b4f08387d98df8ad89eea84e1fbf6336f9c"} Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.044781 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"035561137d5a166b8074f52faefa2f6cd8d380aa7812609a37349baebb80ef15"} Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.045934 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"494e1f607d117c324f826f8eb5379e5c881a787d45f0ea01f918f5a62b82ff67"} Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.061665 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.061701 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:53 crc 
kubenswrapper[4817]: I0320 12:28:53.061713 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.061729 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.061741 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:53Z","lastTransitionTime":"2026-03-20T12:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.164785 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.164815 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.164823 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.164838 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.164847 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:53Z","lastTransitionTime":"2026-03-20T12:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.267247 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.267297 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.267310 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.267328 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.267345 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:53Z","lastTransitionTime":"2026-03-20T12:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.322974 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.323065 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.323099 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.323161 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:28:53 crc kubenswrapper[4817]: E0320 12:28:53.323182 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:28:54.323155927 +0000 UTC m=+96.411468710 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:28:53 crc kubenswrapper[4817]: E0320 12:28:53.323220 4817 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 12:28:53 crc kubenswrapper[4817]: E0320 12:28:53.323197 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.323248 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:28:53 crc kubenswrapper[4817]: E0320 12:28:53.323256 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 12:28:53 crc kubenswrapper[4817]: E0320 12:28:53.323299 4817 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:28:53 crc kubenswrapper[4817]: E0320 12:28:53.323275 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 12:28:54.323258429 +0000 UTC m=+96.411571282 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 12:28:53 crc kubenswrapper[4817]: E0320 12:28:53.323353 4817 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 12:28:53 crc kubenswrapper[4817]: E0320 12:28:53.323690 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 12:28:53 crc kubenswrapper[4817]: E0320 12:28:53.323713 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 12:28:53 crc kubenswrapper[4817]: E0320 12:28:53.323726 4817 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:28:53 crc kubenswrapper[4817]: E0320 12:28:53.323369 
4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 12:28:54.323355142 +0000 UTC m=+96.411667995 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:28:53 crc kubenswrapper[4817]: E0320 12:28:53.323874 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 12:28:54.323850206 +0000 UTC m=+96.412162989 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:28:53 crc kubenswrapper[4817]: E0320 12:28:53.323991 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 12:28:54.323889447 +0000 UTC m=+96.412202230 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.370176 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.370238 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.370248 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.370263 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.370271 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:53Z","lastTransitionTime":"2026-03-20T12:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.472888 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.472939 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.472952 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.473173 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.473186 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:53Z","lastTransitionTime":"2026-03-20T12:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.575590 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.575840 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.575920 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.576027 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.576141 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:53Z","lastTransitionTime":"2026-03-20T12:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.663292 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:28:53 crc kubenswrapper[4817]: E0320 12:28:53.663490 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.679164 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.679200 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.679208 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.679220 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.679229 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:53Z","lastTransitionTime":"2026-03-20T12:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.781377 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.781446 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.781470 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.781500 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.781536 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:53Z","lastTransitionTime":"2026-03-20T12:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.883897 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.883933 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.883943 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.883959 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.883975 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:53Z","lastTransitionTime":"2026-03-20T12:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.986404 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.986431 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.986438 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.986451 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:53 crc kubenswrapper[4817]: I0320 12:28:53.986459 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:53Z","lastTransitionTime":"2026-03-20T12:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.049824 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6fdf4227d7cf4547ae11c2a23b384d704759bf76db25e9fa54bdcd38a7f5752c"} Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.049876 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"491fa84270b85783f875a2c4127920e8174872ed8ab7f49c549e8d5e05c1a5bc"} Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.051382 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7e2dfbadfffc14e1510e5b7cb8d5d88037691ab1566bbe419dbb0e75e1af575b"} Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.064620 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:28:54Z is after 2025-08-24T17:21:41Z" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.077135 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:28:54Z is after 2025-08-24T17:21:41Z" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.088992 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.089028 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.089040 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.089054 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.089064 4817 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:54Z","lastTransitionTime":"2026-03-20T12:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.096698 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:28:54Z is after 2025-08-24T17:21:41Z" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.109257 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:28:54Z is after 2025-08-24T17:21:41Z" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.123783 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:28:54Z is after 2025-08-24T17:21:41Z" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.140853 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fdf4227d7cf4547ae11c2a23b384d704759bf76db25e9fa54bdcd38a7f5752c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T12:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491fa84270b85783f875a2c4127920e8174872ed8ab7f49c549e8d5e05c1a5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T12:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:28:54Z is after 2025-08-24T17:21:41Z" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.155833 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:28:54Z is after 2025-08-24T17:21:41Z" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.170492 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:28:54Z is after 2025-08-24T17:21:41Z" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.182900 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:28:54Z is after 2025-08-24T17:21:41Z" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.191676 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.191713 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.191728 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.191747 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.191760 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:54Z","lastTransitionTime":"2026-03-20T12:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.195349 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:28:54Z is after 2025-08-24T17:21:41Z" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.209809 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fdf4227d7cf4547ae11c2a23b384d704759bf76db25e9fa54bdcd38a7f5752c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T12:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491fa84270b85783f875a2c4127920e8174872ed8ab7f49c549e8d5e05c1a5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T12:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T12:28:54Z is after 2025-08-24T17:21:41Z" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.226181 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T12:28:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e2dfbadfffc14e1510e5b7cb8d5d88037691ab1566bbe419dbb0e75e1af575b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T12:28:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T12:28:54Z is after 2025-08-24T17:21:41Z" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.294231 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.294273 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.294285 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.294303 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.294315 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:54Z","lastTransitionTime":"2026-03-20T12:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.332843 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.332933 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.332971 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 12:28:54 crc kubenswrapper[4817]: E0320 12:28:54.333040 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:28:56.333011353 +0000 UTC m=+98.421324136 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:28:54 crc kubenswrapper[4817]: E0320 12:28:54.333077 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 12:28:54 crc kubenswrapper[4817]: E0320 12:28:54.333093 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 12:28:54 crc kubenswrapper[4817]: E0320 12:28:54.333104 4817 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.333113 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.333174 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:28:54 crc kubenswrapper[4817]: E0320 12:28:54.333181 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 12:28:56.333168458 +0000 UTC m=+98.421481241 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:28:54 crc kubenswrapper[4817]: E0320 12:28:54.333188 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 12:28:54 crc kubenswrapper[4817]: E0320 12:28:54.333241 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 12:28:54 crc kubenswrapper[4817]: E0320 12:28:54.333257 4817 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:28:54 crc kubenswrapper[4817]: E0320 12:28:54.333267 4817 secret.go:188] Couldn't get 
secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 12:28:54 crc kubenswrapper[4817]: E0320 12:28:54.333305 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 12:28:56.333294761 +0000 UTC m=+98.421607644 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:28:54 crc kubenswrapper[4817]: E0320 12:28:54.333344 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 12:28:56.333324832 +0000 UTC m=+98.421637625 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 12:28:54 crc kubenswrapper[4817]: E0320 12:28:54.333391 4817 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 12:28:54 crc kubenswrapper[4817]: E0320 12:28:54.333443 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 12:28:56.333430075 +0000 UTC m=+98.421742858 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.397129 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.397170 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.397180 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.397194 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:54 crc 
kubenswrapper[4817]: I0320 12:28:54.397205 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:54Z","lastTransitionTime":"2026-03-20T12:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.498830 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.498900 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.498924 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.498960 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.499001 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:54Z","lastTransitionTime":"2026-03-20T12:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.601536 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.601574 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.601586 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.601600 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.601610 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:54Z","lastTransitionTime":"2026-03-20T12:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.662955 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:28:54 crc kubenswrapper[4817]: E0320 12:28:54.663105 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.663343 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 12:28:54 crc kubenswrapper[4817]: E0320 12:28:54.663535 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.666748 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.667551 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.668637 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.669215 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.670282 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" 
path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.670832 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.671542 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.672585 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.673216 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.674066 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.674569 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.675551 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.676037 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.676565 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.677475 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.677950 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.679016 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.679385 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.679907 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.680860 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.681408 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.682309 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.682827 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.683869 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.684379 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.684957 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.685992 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.686620 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.687541 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" 
path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.688007 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.688962 4817 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.689082 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.690947 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.692045 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.692595 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.694394 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.695321 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.696351 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.697041 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.698215 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.698765 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.700341 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.701010 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.702088 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.702793 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.703891 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.704261 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.704300 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.704320 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.704348 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.704370 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:54Z","lastTransitionTime":"2026-03-20T12:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.704669 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.706225 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.706814 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.707886 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.708505 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.709466 4817 scope.go:117] "RemoveContainer" containerID="3e572dcd05d0127d887dd34d43b57ba70ca318e81366eeeed89df28e8cd63823" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.709660 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: E0320 12:28:54.709757 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.710524 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.711100 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.712073 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.806885 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.807352 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.807430 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.807500 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.807571 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:54Z","lastTransitionTime":"2026-03-20T12:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.914069 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.914466 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.914561 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.914664 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:54 crc kubenswrapper[4817]: I0320 12:28:54.914783 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:54Z","lastTransitionTime":"2026-03-20T12:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.018227 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.018317 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.018335 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.018360 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.018377 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:55Z","lastTransitionTime":"2026-03-20T12:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.054854 4817 scope.go:117] "RemoveContainer" containerID="3e572dcd05d0127d887dd34d43b57ba70ca318e81366eeeed89df28e8cd63823" Mar 20 12:28:55 crc kubenswrapper[4817]: E0320 12:28:55.055015 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.121028 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.121069 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.121078 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.121093 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.121103 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:55Z","lastTransitionTime":"2026-03-20T12:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.211964 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8447k"] Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.212706 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8447k" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.216471 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.218845 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.219077 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.223790 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.223878 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.223908 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.223945 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.223971 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:55Z","lastTransitionTime":"2026-03-20T12:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.226362 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8wxvf"] Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.227014 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.228689 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-dch6v"] Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.229300 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-65mlb"] Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.229840 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dch6v" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.230648 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-65mlb" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.230921 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.231033 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.232014 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.232063 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.232483 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.232754 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.233293 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.234012 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.238529 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.239057 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.239379 4817 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.239566 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.240484 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f5cb1c98-1ff9-45d7-ad4a-1f98753fcb26-hosts-file\") pod \"node-resolver-8447k\" (UID: \"f5cb1c98-1ff9-45d7-ad4a-1f98753fcb26\") " pod="openshift-dns/node-resolver-8447k" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.240542 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99fnz\" (UniqueName: \"kubernetes.io/projected/f5cb1c98-1ff9-45d7-ad4a-1f98753fcb26-kube-api-access-99fnz\") pod \"node-resolver-8447k\" (UID: \"f5cb1c98-1ff9-45d7-ad4a-1f98753fcb26\") " pod="openshift-dns/node-resolver-8447k" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.257152 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gvrbl"] Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.257903 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.261070 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.261163 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.261270 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.261429 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.261537 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.261811 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.263252 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.326673 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.326741 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.326759 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.326785 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.326808 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:55Z","lastTransitionTime":"2026-03-20T12:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.341783 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-cni-bin\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.342102 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-hostroot\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.342212 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29919e2e-77b2-4461-ba7a-24a733c3f9d1-env-overrides\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.342314 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/29919e2e-77b2-4461-ba7a-24a733c3f9d1-ovnkube-script-lib\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.342401 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8fa4d382-e164-4f94-b782-a7dd98fa4460-os-release\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.342483 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8b7e138-8c64-47fb-84b7-4a42e612947d-proxy-tls\") pod \"machine-config-daemon-dch6v\" (UID: \"c8b7e138-8c64-47fb-84b7-4a42e612947d\") " pod="openshift-machine-config-operator/machine-config-daemon-dch6v"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.342560 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-cni-netd\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.342640 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0417d3f1-c76c-48a5-8d34-e2211a86e098-multus-daemon-config\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.342715 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-node-log\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.342794 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-run-k8s-cni-cncf-io\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.342873 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8fa4d382-e164-4f94-b782-a7dd98fa4460-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.342951 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c8b7e138-8c64-47fb-84b7-4a42e612947d-rootfs\") pod \"machine-config-daemon-dch6v\" (UID: \"c8b7e138-8c64-47fb-84b7-4a42e612947d\") " pod="openshift-machine-config-operator/machine-config-daemon-dch6v"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.343049 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-multus-socket-dir-parent\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.343255 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-cnibin\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.343356 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-var-lib-kubelet\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.343409 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-etc-kubernetes\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.343466 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-systemd-units\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.343658 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-slash\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.343802 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-etc-openvswitch\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.343904 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-run-ovn-kubernetes\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.344000 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-var-lib-cni-multus\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.344169 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5ddj\" (UniqueName: \"kubernetes.io/projected/8fa4d382-e164-4f94-b782-a7dd98fa4460-kube-api-access-b5ddj\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.344305 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-multus-conf-dir\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.344394 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8fa4d382-e164-4f94-b782-a7dd98fa4460-tuning-conf-dir\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.344484 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-kubelet\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.344566 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-run-systemd\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.344658 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-var-lib-openvswitch\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.344769 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-multus-cni-dir\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.344858 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8fa4d382-e164-4f94-b782-a7dd98fa4460-system-cni-dir\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.344948 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtx9v\" (UniqueName: \"kubernetes.io/projected/c8b7e138-8c64-47fb-84b7-4a42e612947d-kube-api-access-qtx9v\") pod \"machine-config-daemon-dch6v\" (UID: \"c8b7e138-8c64-47fb-84b7-4a42e612947d\") " pod="openshift-machine-config-operator/machine-config-daemon-dch6v"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.345045 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-run-openvswitch\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.345188 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99fnz\" (UniqueName: \"kubernetes.io/projected/f5cb1c98-1ff9-45d7-ad4a-1f98753fcb26-kube-api-access-99fnz\") pod \"node-resolver-8447k\" (UID: \"f5cb1c98-1ff9-45d7-ad4a-1f98753fcb26\") " pod="openshift-dns/node-resolver-8447k"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.345287 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-run-netns\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.345565 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-os-release\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.345591 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0417d3f1-c76c-48a5-8d34-e2211a86e098-cni-binary-copy\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.345621 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8fa4d382-e164-4f94-b782-a7dd98fa4460-cni-binary-copy\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.345640 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-run-netns\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.345656 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-run-ovn\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.345736 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29919e2e-77b2-4461-ba7a-24a733c3f9d1-ovn-node-metrics-cert\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.345783 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mnmm\" (UniqueName: \"kubernetes.io/projected/29919e2e-77b2-4461-ba7a-24a733c3f9d1-kube-api-access-9mnmm\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.345813 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.345847 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-system-cni-dir\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.345872 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-var-lib-cni-bin\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.345901 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8fa4d382-e164-4f94-b782-a7dd98fa4460-cnibin\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.345929 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-run-multus-certs\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.345955 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6pdm\" (UniqueName: \"kubernetes.io/projected/0417d3f1-c76c-48a5-8d34-e2211a86e098-kube-api-access-v6pdm\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.345978 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-log-socket\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.346001 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29919e2e-77b2-4461-ba7a-24a733c3f9d1-ovnkube-config\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.346030 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8b7e138-8c64-47fb-84b7-4a42e612947d-mcd-auth-proxy-config\") pod \"machine-config-daemon-dch6v\" (UID: \"c8b7e138-8c64-47fb-84b7-4a42e612947d\") " pod="openshift-machine-config-operator/machine-config-daemon-dch6v"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.346064 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f5cb1c98-1ff9-45d7-ad4a-1f98753fcb26-hosts-file\") pod \"node-resolver-8447k\" (UID: \"f5cb1c98-1ff9-45d7-ad4a-1f98753fcb26\") " pod="openshift-dns/node-resolver-8447k"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.346223 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f5cb1c98-1ff9-45d7-ad4a-1f98753fcb26-hosts-file\") pod \"node-resolver-8447k\" (UID: \"f5cb1c98-1ff9-45d7-ad4a-1f98753fcb26\") " pod="openshift-dns/node-resolver-8447k"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.374930 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99fnz\" (UniqueName: \"kubernetes.io/projected/f5cb1c98-1ff9-45d7-ad4a-1f98753fcb26-kube-api-access-99fnz\") pod \"node-resolver-8447k\" (UID: \"f5cb1c98-1ff9-45d7-ad4a-1f98753fcb26\") " pod="openshift-dns/node-resolver-8447k"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.430299 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.430361 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.430379 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.430402 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.430419 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:55Z","lastTransitionTime":"2026-03-20T12:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447058 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-kubelet\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447168 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-run-systemd\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447209 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-var-lib-openvswitch\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447249 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-run-openvswitch\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447258 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-kubelet\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447282 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-multus-cni-dir\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447312 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-run-systemd\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447322 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8fa4d382-e164-4f94-b782-a7dd98fa4460-system-cni-dir\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447354 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-run-openvswitch\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447370 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8fa4d382-e164-4f94-b782-a7dd98fa4460-system-cni-dir\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447376 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtx9v\" (UniqueName: \"kubernetes.io/projected/c8b7e138-8c64-47fb-84b7-4a42e612947d-kube-api-access-qtx9v\") pod \"machine-config-daemon-dch6v\" (UID: \"c8b7e138-8c64-47fb-84b7-4a42e612947d\") " pod="openshift-machine-config-operator/machine-config-daemon-dch6v"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447414 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-var-lib-openvswitch\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447438 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-run-netns\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447480 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-os-release\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447517 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0417d3f1-c76c-48a5-8d34-e2211a86e098-cni-binary-copy\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447585 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8fa4d382-e164-4f94-b782-a7dd98fa4460-cni-binary-copy\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447619 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-run-netns\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447643 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-multus-cni-dir\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447655 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-run-ovn\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447698 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447740 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29919e2e-77b2-4461-ba7a-24a733c3f9d1-ovn-node-metrics-cert\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447783 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mnmm\" (UniqueName: \"kubernetes.io/projected/29919e2e-77b2-4461-ba7a-24a733c3f9d1-kube-api-access-9mnmm\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447814 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-os-release\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447823 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8fa4d382-e164-4f94-b782-a7dd98fa4460-cnibin\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447704 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-run-netns\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447867 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-system-cni-dir\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447898 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-var-lib-cni-bin\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447912 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-run-ovn\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447964 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-log-socket\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447910 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447977 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8fa4d382-e164-4f94-b782-a7dd98fa4460-cnibin\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447987 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-system-cni-dir\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447922 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-log-socket\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.447744 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-run-netns\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448166 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29919e2e-77b2-4461-ba7a-24a733c3f9d1-ovnkube-config\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448217 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-run-multus-certs\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448251 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6pdm\" (UniqueName: \"kubernetes.io/projected/0417d3f1-c76c-48a5-8d34-e2211a86e098-kube-api-access-v6pdm\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448291 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8b7e138-8c64-47fb-84b7-4a42e612947d-mcd-auth-proxy-config\") pod \"machine-config-daemon-dch6v\" (UID: \"c8b7e138-8c64-47fb-84b7-4a42e612947d\") " pod="openshift-machine-config-operator/machine-config-daemon-dch6v"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448329 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-cni-bin\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448342 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName:
\"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-run-multus-certs\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448361 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-hostroot\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448015 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-var-lib-cni-bin\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448403 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29919e2e-77b2-4461-ba7a-24a733c3f9d1-env-overrides\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448454 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/29919e2e-77b2-4461-ba7a-24a733c3f9d1-ovnkube-script-lib\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448487 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0417d3f1-c76c-48a5-8d34-e2211a86e098-cni-binary-copy\") pod \"multus-8wxvf\" (UID: 
\"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448502 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8b7e138-8c64-47fb-84b7-4a42e612947d-proxy-tls\") pod \"machine-config-daemon-dch6v\" (UID: \"c8b7e138-8c64-47fb-84b7-4a42e612947d\") " pod="openshift-machine-config-operator/machine-config-daemon-dch6v" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448557 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-cni-netd\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448601 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8fa4d382-e164-4f94-b782-a7dd98fa4460-os-release\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448643 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-run-k8s-cni-cncf-io\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448679 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0417d3f1-c76c-48a5-8d34-e2211a86e098-multus-daemon-config\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " 
pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448714 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-node-log\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448752 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-multus-socket-dir-parent\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448786 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8fa4d382-e164-4f94-b782-a7dd98fa4460-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448819 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c8b7e138-8c64-47fb-84b7-4a42e612947d-rootfs\") pod \"machine-config-daemon-dch6v\" (UID: \"c8b7e138-8c64-47fb-84b7-4a42e612947d\") " pod="openshift-machine-config-operator/machine-config-daemon-dch6v" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448850 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-etc-kubernetes\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc 
kubenswrapper[4817]: I0320 12:28:55.448881 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-systemd-units\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448914 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-slash\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448946 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-etc-openvswitch\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.448982 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-cnibin\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449003 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8fa4d382-e164-4f94-b782-a7dd98fa4460-cni-binary-copy\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449025 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-var-lib-kubelet\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449073 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-run-ovn-kubernetes\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449103 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29919e2e-77b2-4461-ba7a-24a733c3f9d1-ovnkube-config\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449111 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-var-lib-cni-multus\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449158 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-cni-bin\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449185 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5ddj\" (UniqueName: 
\"kubernetes.io/projected/8fa4d382-e164-4f94-b782-a7dd98fa4460-kube-api-access-b5ddj\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449209 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c8b7e138-8c64-47fb-84b7-4a42e612947d-rootfs\") pod \"machine-config-daemon-dch6v\" (UID: \"c8b7e138-8c64-47fb-84b7-4a42e612947d\") " pod="openshift-machine-config-operator/machine-config-daemon-dch6v" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449222 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8fa4d382-e164-4f94-b782-a7dd98fa4460-tuning-conf-dir\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449265 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/29919e2e-77b2-4461-ba7a-24a733c3f9d1-ovnkube-script-lib\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449291 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-multus-conf-dir\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449409 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-multus-conf-dir\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449468 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-etc-kubernetes\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449493 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8b7e138-8c64-47fb-84b7-4a42e612947d-mcd-auth-proxy-config\") pod \"machine-config-daemon-dch6v\" (UID: \"c8b7e138-8c64-47fb-84b7-4a42e612947d\") " pod="openshift-machine-config-operator/machine-config-daemon-dch6v" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449519 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-systemd-units\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449101 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-hostroot\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449588 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-multus-socket-dir-parent\") pod \"multus-8wxvf\" (UID: 
\"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449581 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8fa4d382-e164-4f94-b782-a7dd98fa4460-os-release\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449696 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-slash\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449697 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29919e2e-77b2-4461-ba7a-24a733c3f9d1-env-overrides\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449728 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-var-lib-kubelet\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449805 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-cnibin\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449812 4817 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-run-k8s-cni-cncf-io\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449763 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0417d3f1-c76c-48a5-8d34-e2211a86e098-multus-daemon-config\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449783 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0417d3f1-c76c-48a5-8d34-e2211a86e098-host-var-lib-cni-multus\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449785 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-etc-openvswitch\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449788 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-run-ovn-kubernetes\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449726 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-host-cni-netd\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.449755 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/29919e2e-77b2-4461-ba7a-24a733c3f9d1-node-log\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.450832 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8fa4d382-e164-4f94-b782-a7dd98fa4460-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.453393 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8b7e138-8c64-47fb-84b7-4a42e612947d-proxy-tls\") pod \"machine-config-daemon-dch6v\" (UID: \"c8b7e138-8c64-47fb-84b7-4a42e612947d\") " pod="openshift-machine-config-operator/machine-config-daemon-dch6v" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.454691 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29919e2e-77b2-4461-ba7a-24a733c3f9d1-ovn-node-metrics-cert\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.462386 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/8fa4d382-e164-4f94-b782-a7dd98fa4460-tuning-conf-dir\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.467344 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtx9v\" (UniqueName: \"kubernetes.io/projected/c8b7e138-8c64-47fb-84b7-4a42e612947d-kube-api-access-qtx9v\") pod \"machine-config-daemon-dch6v\" (UID: \"c8b7e138-8c64-47fb-84b7-4a42e612947d\") " pod="openshift-machine-config-operator/machine-config-daemon-dch6v" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.469738 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6pdm\" (UniqueName: \"kubernetes.io/projected/0417d3f1-c76c-48a5-8d34-e2211a86e098-kube-api-access-v6pdm\") pod \"multus-8wxvf\" (UID: \"0417d3f1-c76c-48a5-8d34-e2211a86e098\") " pod="openshift-multus/multus-8wxvf" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.477549 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mnmm\" (UniqueName: \"kubernetes.io/projected/29919e2e-77b2-4461-ba7a-24a733c3f9d1-kube-api-access-9mnmm\") pod \"ovnkube-node-gvrbl\" (UID: \"29919e2e-77b2-4461-ba7a-24a733c3f9d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.485923 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5ddj\" (UniqueName: \"kubernetes.io/projected/8fa4d382-e164-4f94-b782-a7dd98fa4460-kube-api-access-b5ddj\") pod \"multus-additional-cni-plugins-65mlb\" (UID: \"8fa4d382-e164-4f94-b782-a7dd98fa4460\") " pod="openshift-multus/multus-additional-cni-plugins-65mlb" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.492706 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-tfqp8"] Mar 20 
12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.493279 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tfqp8" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.498656 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.498856 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.498987 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.499259 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.532970 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.533308 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.533434 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.533572 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.533702 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:55Z","lastTransitionTime":"2026-03-20T12:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.537117 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8447k" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.550402 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bbaeb605-79ea-4225-8fe8-0bb01317003a-host\") pod \"node-ca-tfqp8\" (UID: \"bbaeb605-79ea-4225-8fe8-0bb01317003a\") " pod="openshift-image-registry/node-ca-tfqp8" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.550582 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bbaeb605-79ea-4225-8fe8-0bb01317003a-serviceca\") pod \"node-ca-tfqp8\" (UID: \"bbaeb605-79ea-4225-8fe8-0bb01317003a\") " pod="openshift-image-registry/node-ca-tfqp8" Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.550673 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46vjl\" (UniqueName: \"kubernetes.io/projected/bbaeb605-79ea-4225-8fe8-0bb01317003a-kube-api-access-46vjl\") pod \"node-ca-tfqp8\" (UID: \"bbaeb605-79ea-4225-8fe8-0bb01317003a\") " pod="openshift-image-registry/node-ca-tfqp8" Mar 20 12:28:55 crc kubenswrapper[4817]: W0320 12:28:55.551037 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5cb1c98_1ff9_45d7_ad4a_1f98753fcb26.slice/crio-ba0f33f41119de9105f6b70ebccd9c4de9f82185fc3e0e8e440b0eaf0a6bc7ed WatchSource:0}: Error finding container ba0f33f41119de9105f6b70ebccd9c4de9f82185fc3e0e8e440b0eaf0a6bc7ed: Status 404 returned error can't find the container with id ba0f33f41119de9105f6b70ebccd9c4de9f82185fc3e0e8e440b0eaf0a6bc7ed Mar 20 12:28:55 
crc kubenswrapper[4817]: I0320 12:28:55.559488 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8wxvf"
Mar 20 12:28:55 crc kubenswrapper[4817]: W0320 12:28:55.572677 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0417d3f1_c76c_48a5_8d34_e2211a86e098.slice/crio-d851a34fb1c4d8d2aa18ed53d21d38398b35ff107fa5b7a840b82f2e0ff7fa57 WatchSource:0}: Error finding container d851a34fb1c4d8d2aa18ed53d21d38398b35ff107fa5b7a840b82f2e0ff7fa57: Status 404 returned error can't find the container with id d851a34fb1c4d8d2aa18ed53d21d38398b35ff107fa5b7a840b82f2e0ff7fa57
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.573641 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dch6v"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.585960 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-65mlb"
Mar 20 12:28:55 crc kubenswrapper[4817]: W0320 12:28:55.588692 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8b7e138_8c64_47fb_84b7_4a42e612947d.slice/crio-a336042aa4c00c541a1d4d92c3992fa4a4fc70865ce18d88dcae2e4f6708e7cd WatchSource:0}: Error finding container a336042aa4c00c541a1d4d92c3992fa4a4fc70865ce18d88dcae2e4f6708e7cd: Status 404 returned error can't find the container with id a336042aa4c00c541a1d4d92c3992fa4a4fc70865ce18d88dcae2e4f6708e7cd
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.599223 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl"
Mar 20 12:28:55 crc kubenswrapper[4817]: W0320 12:28:55.605880 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fa4d382_e164_4f94_b782_a7dd98fa4460.slice/crio-30e7519dfb31f4c9f749aed0d9394bc64e1ea441e7a6af5782943c13585f5960 WatchSource:0}: Error finding container 30e7519dfb31f4c9f749aed0d9394bc64e1ea441e7a6af5782943c13585f5960: Status 404 returned error can't find the container with id 30e7519dfb31f4c9f749aed0d9394bc64e1ea441e7a6af5782943c13585f5960
Mar 20 12:28:55 crc kubenswrapper[4817]: W0320 12:28:55.619947 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29919e2e_77b2_4461_ba7a_24a733c3f9d1.slice/crio-a52c1ea3ee7b09b07d31cc52a22cb745983839084e6254693baf15b77aa9a18b WatchSource:0}: Error finding container a52c1ea3ee7b09b07d31cc52a22cb745983839084e6254693baf15b77aa9a18b: Status 404 returned error can't find the container with id a52c1ea3ee7b09b07d31cc52a22cb745983839084e6254693baf15b77aa9a18b
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.637352 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.637396 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.637409 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.637428 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.637440 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:55Z","lastTransitionTime":"2026-03-20T12:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.651463 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46vjl\" (UniqueName: \"kubernetes.io/projected/bbaeb605-79ea-4225-8fe8-0bb01317003a-kube-api-access-46vjl\") pod \"node-ca-tfqp8\" (UID: \"bbaeb605-79ea-4225-8fe8-0bb01317003a\") " pod="openshift-image-registry/node-ca-tfqp8"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.651539 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bbaeb605-79ea-4225-8fe8-0bb01317003a-host\") pod \"node-ca-tfqp8\" (UID: \"bbaeb605-79ea-4225-8fe8-0bb01317003a\") " pod="openshift-image-registry/node-ca-tfqp8"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.651554 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bbaeb605-79ea-4225-8fe8-0bb01317003a-serviceca\") pod \"node-ca-tfqp8\" (UID: \"bbaeb605-79ea-4225-8fe8-0bb01317003a\") " pod="openshift-image-registry/node-ca-tfqp8"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.651981 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bbaeb605-79ea-4225-8fe8-0bb01317003a-host\") pod \"node-ca-tfqp8\" (UID: \"bbaeb605-79ea-4225-8fe8-0bb01317003a\") " pod="openshift-image-registry/node-ca-tfqp8"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.652683 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bbaeb605-79ea-4225-8fe8-0bb01317003a-serviceca\") pod \"node-ca-tfqp8\" (UID: \"bbaeb605-79ea-4225-8fe8-0bb01317003a\") " pod="openshift-image-registry/node-ca-tfqp8"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.663369 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 12:28:55 crc kubenswrapper[4817]: E0320 12:28:55.663881 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.695235 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24"]
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.695653 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.701595 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.701616 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.716930 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46vjl\" (UniqueName: \"kubernetes.io/projected/bbaeb605-79ea-4225-8fe8-0bb01317003a-kube-api-access-46vjl\") pod \"node-ca-tfqp8\" (UID: \"bbaeb605-79ea-4225-8fe8-0bb01317003a\") " pod="openshift-image-registry/node-ca-tfqp8"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.726624 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-xq7wp"]
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.727837 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xq7wp"
Mar 20 12:28:55 crc kubenswrapper[4817]: E0320 12:28:55.727950 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xq7wp" podUID="c9cb3896-c4ff-4ccb-b494-eac8b4460342"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.741761 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.741851 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.741893 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.741922 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.741937 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:55Z","lastTransitionTime":"2026-03-20T12:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.752183 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/926977c3-81d7-4b32-a585-79578717ac7c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bhk24\" (UID: \"926977c3-81d7-4b32-a585-79578717ac7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.752286 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/926977c3-81d7-4b32-a585-79578717ac7c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bhk24\" (UID: \"926977c3-81d7-4b32-a585-79578717ac7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.752322 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zt9k\" (UniqueName: \"kubernetes.io/projected/926977c3-81d7-4b32-a585-79578717ac7c-kube-api-access-9zt9k\") pod \"ovnkube-control-plane-749d76644c-bhk24\" (UID: \"926977c3-81d7-4b32-a585-79578717ac7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.752351 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/926977c3-81d7-4b32-a585-79578717ac7c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bhk24\" (UID: \"926977c3-81d7-4b32-a585-79578717ac7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.807002 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tfqp8"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.844855 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.844918 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.844960 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.844977 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.844987 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:55Z","lastTransitionTime":"2026-03-20T12:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.853571 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs\") pod \"network-metrics-daemon-xq7wp\" (UID: \"c9cb3896-c4ff-4ccb-b494-eac8b4460342\") " pod="openshift-multus/network-metrics-daemon-xq7wp"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.853629 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w62jd\" (UniqueName: \"kubernetes.io/projected/c9cb3896-c4ff-4ccb-b494-eac8b4460342-kube-api-access-w62jd\") pod \"network-metrics-daemon-xq7wp\" (UID: \"c9cb3896-c4ff-4ccb-b494-eac8b4460342\") " pod="openshift-multus/network-metrics-daemon-xq7wp"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.853662 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/926977c3-81d7-4b32-a585-79578717ac7c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bhk24\" (UID: \"926977c3-81d7-4b32-a585-79578717ac7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.853701 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zt9k\" (UniqueName: \"kubernetes.io/projected/926977c3-81d7-4b32-a585-79578717ac7c-kube-api-access-9zt9k\") pod \"ovnkube-control-plane-749d76644c-bhk24\" (UID: \"926977c3-81d7-4b32-a585-79578717ac7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.853724 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/926977c3-81d7-4b32-a585-79578717ac7c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bhk24\" (UID: \"926977c3-81d7-4b32-a585-79578717ac7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.853757 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/926977c3-81d7-4b32-a585-79578717ac7c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bhk24\" (UID: \"926977c3-81d7-4b32-a585-79578717ac7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.854585 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/926977c3-81d7-4b32-a585-79578717ac7c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bhk24\" (UID: \"926977c3-81d7-4b32-a585-79578717ac7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.855348 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/926977c3-81d7-4b32-a585-79578717ac7c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bhk24\" (UID: \"926977c3-81d7-4b32-a585-79578717ac7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.862580 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/926977c3-81d7-4b32-a585-79578717ac7c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bhk24\" (UID: \"926977c3-81d7-4b32-a585-79578717ac7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.870584 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zt9k\" (UniqueName: \"kubernetes.io/projected/926977c3-81d7-4b32-a585-79578717ac7c-kube-api-access-9zt9k\") pod \"ovnkube-control-plane-749d76644c-bhk24\" (UID: \"926977c3-81d7-4b32-a585-79578717ac7c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24"
Mar 20 12:28:55 crc kubenswrapper[4817]: W0320 12:28:55.870979 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbaeb605_79ea_4225_8fe8_0bb01317003a.slice/crio-4d0c69f8c2101ea1a1a1b03b6cffb516c73ca5ab839248bdbc2ca4d16721b122 WatchSource:0}: Error finding container 4d0c69f8c2101ea1a1a1b03b6cffb516c73ca5ab839248bdbc2ca4d16721b122: Status 404 returned error can't find the container with id 4d0c69f8c2101ea1a1a1b03b6cffb516c73ca5ab839248bdbc2ca4d16721b122
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.947389 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.947442 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.947457 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.947486 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.947538 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:55Z","lastTransitionTime":"2026-03-20T12:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.954378 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs\") pod \"network-metrics-daemon-xq7wp\" (UID: \"c9cb3896-c4ff-4ccb-b494-eac8b4460342\") " pod="openshift-multus/network-metrics-daemon-xq7wp"
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.954422 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w62jd\" (UniqueName: \"kubernetes.io/projected/c9cb3896-c4ff-4ccb-b494-eac8b4460342-kube-api-access-w62jd\") pod \"network-metrics-daemon-xq7wp\" (UID: \"c9cb3896-c4ff-4ccb-b494-eac8b4460342\") " pod="openshift-multus/network-metrics-daemon-xq7wp"
Mar 20 12:28:55 crc kubenswrapper[4817]: E0320 12:28:55.954557 4817 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 12:28:55 crc kubenswrapper[4817]: E0320 12:28:55.954641 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs podName:c9cb3896-c4ff-4ccb-b494-eac8b4460342 nodeName:}" failed. No retries permitted until 2026-03-20 12:28:56.45462058 +0000 UTC m=+98.542933363 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs") pod "network-metrics-daemon-xq7wp" (UID: "c9cb3896-c4ff-4ccb-b494-eac8b4460342") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 12:28:55 crc kubenswrapper[4817]: I0320 12:28:55.975256 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w62jd\" (UniqueName: \"kubernetes.io/projected/c9cb3896-c4ff-4ccb-b494-eac8b4460342-kube-api-access-w62jd\") pod \"network-metrics-daemon-xq7wp\" (UID: \"c9cb3896-c4ff-4ccb-b494-eac8b4460342\") " pod="openshift-multus/network-metrics-daemon-xq7wp"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.050079 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.050200 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.050228 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.050264 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.050290 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:56Z","lastTransitionTime":"2026-03-20T12:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.060187 4817 generic.go:334] "Generic (PLEG): container finished" podID="29919e2e-77b2-4461-ba7a-24a733c3f9d1" containerID="98d290d0c3dc6663b6f7659c961ab100905704c2a4ed3ab493044eff470bab62" exitCode=0
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.060266 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" event={"ID":"29919e2e-77b2-4461-ba7a-24a733c3f9d1","Type":"ContainerDied","Data":"98d290d0c3dc6663b6f7659c961ab100905704c2a4ed3ab493044eff470bab62"}
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.060299 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" event={"ID":"29919e2e-77b2-4461-ba7a-24a733c3f9d1","Type":"ContainerStarted","Data":"a52c1ea3ee7b09b07d31cc52a22cb745983839084e6254693baf15b77aa9a18b"}
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.062840 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8wxvf" event={"ID":"0417d3f1-c76c-48a5-8d34-e2211a86e098","Type":"ContainerStarted","Data":"120055e7e3a92cd394ab960071ddf11ca2f1553bd3354ec176c300f6b2758c05"}
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.062879 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8wxvf" event={"ID":"0417d3f1-c76c-48a5-8d34-e2211a86e098","Type":"ContainerStarted","Data":"d851a34fb1c4d8d2aa18ed53d21d38398b35ff107fa5b7a840b82f2e0ff7fa57"}
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.067792 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dch6v" event={"ID":"c8b7e138-8c64-47fb-84b7-4a42e612947d","Type":"ContainerStarted","Data":"faa11d0bf73cc4879ce3c9c2e49f4a8a90c520fb41949ddb834e616a9ab5e1e8"}
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.067826 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dch6v" event={"ID":"c8b7e138-8c64-47fb-84b7-4a42e612947d","Type":"ContainerStarted","Data":"bf0956e93ab88c31a4db3e9b805bb42b61e80dc5e4176715cb441ce0c2ff0420"}
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.067841 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dch6v" event={"ID":"c8b7e138-8c64-47fb-84b7-4a42e612947d","Type":"ContainerStarted","Data":"a336042aa4c00c541a1d4d92c3992fa4a4fc70865ce18d88dcae2e4f6708e7cd"}
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.070311 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tfqp8" event={"ID":"bbaeb605-79ea-4225-8fe8-0bb01317003a","Type":"ContainerStarted","Data":"4d0c69f8c2101ea1a1a1b03b6cffb516c73ca5ab839248bdbc2ca4d16721b122"}
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.072710 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.073611 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"246a0d5d0516349be7f61afcac5bacf127c1be9d5b534f365b52a0e68775010a"}
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.077694 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65mlb" event={"ID":"8fa4d382-e164-4f94-b782-a7dd98fa4460","Type":"ContainerStarted","Data":"6cd3ae60869fc7b3e9434b5513524e4257984549384163cc2ce8481abc1dcfc0"}
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.077731 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65mlb" event={"ID":"8fa4d382-e164-4f94-b782-a7dd98fa4460","Type":"ContainerStarted","Data":"30e7519dfb31f4c9f749aed0d9394bc64e1ea441e7a6af5782943c13585f5960"}
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.082770 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8447k" event={"ID":"f5cb1c98-1ff9-45d7-ad4a-1f98753fcb26","Type":"ContainerStarted","Data":"8adfadf3b7d8a72c3228ca9d3d3589e9917eff1db9046e7eb6533124241393ae"}
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.082797 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8447k" event={"ID":"f5cb1c98-1ff9-45d7-ad4a-1f98753fcb26","Type":"ContainerStarted","Data":"ba0f33f41119de9105f6b70ebccd9c4de9f82185fc3e0e8e440b0eaf0a6bc7ed"}
Mar 20 12:28:56 crc kubenswrapper[4817]: W0320 12:28:56.103438 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod926977c3_81d7_4b32_a585_79578717ac7c.slice/crio-042c09116b784d773067cda63883034494d81d512b9be8108c5f8428029a3ff5 WatchSource:0}: Error finding container 042c09116b784d773067cda63883034494d81d512b9be8108c5f8428029a3ff5: Status 404 returned error can't find the container with id 042c09116b784d773067cda63883034494d81d512b9be8108c5f8428029a3ff5
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.118156 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8447k" podStartSLOduration=59.118138398 podStartE2EDuration="59.118138398s" podCreationTimestamp="2026-03-20 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:28:56.117275744 +0000 UTC m=+98.205588547" watchObservedRunningTime="2026-03-20 12:28:56.118138398 +0000 UTC m=+98.206451171"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.153578 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.153625 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.153643 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.153664 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.153682 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:56Z","lastTransitionTime":"2026-03-20T12:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.169513 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-dch6v" podStartSLOduration=59.169490159 podStartE2EDuration="59.169490159s" podCreationTimestamp="2026-03-20 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:28:56.135243845 +0000 UTC m=+98.223556638" watchObservedRunningTime="2026-03-20 12:28:56.169490159 +0000 UTC m=+98.257802952"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.197320 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8wxvf" podStartSLOduration=58.197293754 podStartE2EDuration="58.197293754s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:28:56.197109109 +0000 UTC m=+98.285421892" watchObservedRunningTime="2026-03-20 12:28:56.197293754 +0000 UTC m=+98.285606537"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.257408 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.257454 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.257465 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.257486 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.257497 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:56Z","lastTransitionTime":"2026-03-20T12:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.358565 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 12:28:56 crc kubenswrapper[4817]: E0320 12:28:56.358791 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:00.358739084 +0000 UTC m=+102.447051997 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.358891 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.358970 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.359046 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.359098 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 12:28:56 crc kubenswrapper[4817]: E0320 12:28:56.359145 4817 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 12:28:56 crc kubenswrapper[4817]: E0320 12:28:56.359253 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 12:28:56 crc kubenswrapper[4817]: E0320 12:28:56.359152 4817 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 12:28:56 crc kubenswrapper[4817]: E0320 12:28:56.359254 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:00.359223537 +0000 UTC m=+102.447536350 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 12:28:56 crc kubenswrapper[4817]: E0320 12:28:56.359281 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 12:28:56 crc kubenswrapper[4817]: E0320 12:28:56.359372 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:00.359353941 +0000 UTC m=+102.447666734 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 12:28:56 crc kubenswrapper[4817]: E0320 12:28:56.359385 4817 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 12:28:56 crc kubenswrapper[4817]: E0320 12:28:56.359459 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:00.359432253 +0000 UTC m=+102.447745036 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 12:28:56 crc kubenswrapper[4817]: E0320 12:28:56.359539 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 12:28:56 crc kubenswrapper[4817]: E0320 12:28:56.359572 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 12:28:56 crc kubenswrapper[4817]: E0320 12:28:56.359593 4817 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 12:28:56 crc kubenswrapper[4817]: E0320 12:28:56.359672 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:00.359655799 +0000 UTC m=+102.447968762 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.361013 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.361056 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.361069 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.361092 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.361107 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:56Z","lastTransitionTime":"2026-03-20T12:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.460275 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs\") pod \"network-metrics-daemon-xq7wp\" (UID: \"c9cb3896-c4ff-4ccb-b494-eac8b4460342\") " pod="openshift-multus/network-metrics-daemon-xq7wp" Mar 20 12:28:56 crc kubenswrapper[4817]: E0320 12:28:56.460545 4817 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 12:28:56 crc kubenswrapper[4817]: E0320 12:28:56.460667 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs podName:c9cb3896-c4ff-4ccb-b494-eac8b4460342 nodeName:}" failed. No retries permitted until 2026-03-20 12:28:57.460647694 +0000 UTC m=+99.548960477 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs") pod "network-metrics-daemon-xq7wp" (UID: "c9cb3896-c4ff-4ccb-b494-eac8b4460342") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.464874 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.464934 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.464951 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.464975 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.464999 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:56Z","lastTransitionTime":"2026-03-20T12:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.567050 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.567095 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.567106 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.567139 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.567151 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:56Z","lastTransitionTime":"2026-03-20T12:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.662437 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:28:56 crc kubenswrapper[4817]: E0320 12:28:56.662605 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.662644 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 12:28:56 crc kubenswrapper[4817]: E0320 12:28:56.662809 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.668717 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.668754 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.668764 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.668780 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.668790 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:56Z","lastTransitionTime":"2026-03-20T12:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.771682 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.771733 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.771745 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.771770 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.771786 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:56Z","lastTransitionTime":"2026-03-20T12:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.873974 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.874308 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.874318 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.874335 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.874345 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:56Z","lastTransitionTime":"2026-03-20T12:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.976382 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.976417 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.976427 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.976441 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:56 crc kubenswrapper[4817]: I0320 12:28:56.976451 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:56Z","lastTransitionTime":"2026-03-20T12:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.079558 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.079608 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.079620 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.079636 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.079651 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:57Z","lastTransitionTime":"2026-03-20T12:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.105680 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" event={"ID":"29919e2e-77b2-4461-ba7a-24a733c3f9d1","Type":"ContainerStarted","Data":"ef5a0f728e46a1e22f0874a62e06e0a14cbcc7143c81dca22a4f42a20108fa7c"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.105747 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" event={"ID":"29919e2e-77b2-4461-ba7a-24a733c3f9d1","Type":"ContainerStarted","Data":"3e86d8c8464b076e1480d1627e8287a0d22a70be253e0692bf74b9745bf22778"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.105769 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" event={"ID":"29919e2e-77b2-4461-ba7a-24a733c3f9d1","Type":"ContainerStarted","Data":"8964eeeb6a825b3b3e8990773875f492a16702407dcffc3d9149efeb26fc69ae"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.105786 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" event={"ID":"29919e2e-77b2-4461-ba7a-24a733c3f9d1","Type":"ContainerStarted","Data":"7ee4c653828617151a896f8d2baf7ec67c0dbdf7867e561cde1fc95f82b345fd"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.105802 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" event={"ID":"29919e2e-77b2-4461-ba7a-24a733c3f9d1","Type":"ContainerStarted","Data":"032f2ea931e15a0dd79efc9483edd260b436b870066e128a529668b3e43bb2ab"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.105818 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" event={"ID":"29919e2e-77b2-4461-ba7a-24a733c3f9d1","Type":"ContainerStarted","Data":"d598729c3c3f84237e200d5ea9961f380aa8c5da236beccad4a3ca67a693af39"} Mar 20 12:28:57 crc kubenswrapper[4817]: 
I0320 12:28:57.108359 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tfqp8" event={"ID":"bbaeb605-79ea-4225-8fe8-0bb01317003a","Type":"ContainerStarted","Data":"33e7d7f40862714b218e7528ec25cabcb14f0904e24fc0b9689024baf8d975f2"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.114941 4817 generic.go:334] "Generic (PLEG): container finished" podID="8fa4d382-e164-4f94-b782-a7dd98fa4460" containerID="6cd3ae60869fc7b3e9434b5513524e4257984549384163cc2ce8481abc1dcfc0" exitCode=0 Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.115018 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65mlb" event={"ID":"8fa4d382-e164-4f94-b782-a7dd98fa4460","Type":"ContainerDied","Data":"6cd3ae60869fc7b3e9434b5513524e4257984549384163cc2ce8481abc1dcfc0"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.117864 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24" event={"ID":"926977c3-81d7-4b32-a585-79578717ac7c","Type":"ContainerStarted","Data":"21e3e65b0b82ae9dd23308c37bc712cc5fff57e54993f8a60bcc46b4f6367e99"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.117957 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24" event={"ID":"926977c3-81d7-4b32-a585-79578717ac7c","Type":"ContainerStarted","Data":"7f0f3a5e3c53c73ddbf08cb12958f699b576fb2c89d2b3e455fee512aa25fd0b"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.117986 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24" event={"ID":"926977c3-81d7-4b32-a585-79578717ac7c","Type":"ContainerStarted","Data":"042c09116b784d773067cda63883034494d81d512b9be8108c5f8428029a3ff5"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.128446 4817 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-image-registry/node-ca-tfqp8" podStartSLOduration=60.128381264 podStartE2EDuration="1m0.128381264s" podCreationTimestamp="2026-03-20 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:28:57.12823165 +0000 UTC m=+99.216544473" watchObservedRunningTime="2026-03-20 12:28:57.128381264 +0000 UTC m=+99.216694087" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.164447 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhk24" podStartSLOduration=59.164396758 podStartE2EDuration="59.164396758s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:28:57.164030818 +0000 UTC m=+99.252343641" watchObservedRunningTime="2026-03-20 12:28:57.164396758 +0000 UTC m=+99.252709551" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.182178 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.182239 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.182258 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.182282 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.182300 4817 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:57Z","lastTransitionTime":"2026-03-20T12:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.285820 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.285858 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.285872 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.285890 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.285905 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:57Z","lastTransitionTime":"2026-03-20T12:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.389608 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.389927 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.389944 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.389966 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.389989 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:57Z","lastTransitionTime":"2026-03-20T12:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.471877 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs\") pod \"network-metrics-daemon-xq7wp\" (UID: \"c9cb3896-c4ff-4ccb-b494-eac8b4460342\") " pod="openshift-multus/network-metrics-daemon-xq7wp" Mar 20 12:28:57 crc kubenswrapper[4817]: E0320 12:28:57.472023 4817 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 12:28:57 crc kubenswrapper[4817]: E0320 12:28:57.472071 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs podName:c9cb3896-c4ff-4ccb-b494-eac8b4460342 nodeName:}" failed. No retries permitted until 2026-03-20 12:28:59.472058493 +0000 UTC m=+101.560371266 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs") pod "network-metrics-daemon-xq7wp" (UID: "c9cb3896-c4ff-4ccb-b494-eac8b4460342") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.492601 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.492641 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.492652 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.492698 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.492711 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:57Z","lastTransitionTime":"2026-03-20T12:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.595425 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.595463 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.595474 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.595490 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.595500 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:57Z","lastTransitionTime":"2026-03-20T12:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.662658 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.662739 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xq7wp" Mar 20 12:28:57 crc kubenswrapper[4817]: E0320 12:28:57.662832 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 12:28:57 crc kubenswrapper[4817]: E0320 12:28:57.662998 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xq7wp" podUID="c9cb3896-c4ff-4ccb-b494-eac8b4460342" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.698616 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.698667 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.698685 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.698704 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.698721 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:57Z","lastTransitionTime":"2026-03-20T12:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.800406 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.800459 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.800477 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.800499 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.800517 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:57Z","lastTransitionTime":"2026-03-20T12:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.903023 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.903059 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.903072 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.903088 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:57 crc kubenswrapper[4817]: I0320 12:28:57.903100 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:57Z","lastTransitionTime":"2026-03-20T12:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.006088 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.006189 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.006212 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.006241 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.006265 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:58Z","lastTransitionTime":"2026-03-20T12:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.109003 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.109048 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.109061 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.109079 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.109091 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:58Z","lastTransitionTime":"2026-03-20T12:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.123565 4817 generic.go:334] "Generic (PLEG): container finished" podID="8fa4d382-e164-4f94-b782-a7dd98fa4460" containerID="8cc75edd13181076631148ceecc4a9d7a86ab8ac94d0b0ac6c0357366274b864" exitCode=0 Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.123657 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65mlb" event={"ID":"8fa4d382-e164-4f94-b782-a7dd98fa4460","Type":"ContainerDied","Data":"8cc75edd13181076631148ceecc4a9d7a86ab8ac94d0b0ac6c0357366274b864"} Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.211581 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.211629 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.211640 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.211658 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.211671 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:58Z","lastTransitionTime":"2026-03-20T12:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.314077 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.314157 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.314170 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.314188 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.314199 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:58Z","lastTransitionTime":"2026-03-20T12:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.416998 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.417369 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.417380 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.417393 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.417404 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:58Z","lastTransitionTime":"2026-03-20T12:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.520581 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.520619 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.520632 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.520647 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.520658 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:58Z","lastTransitionTime":"2026-03-20T12:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.623334 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.623383 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.623407 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.623431 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.623445 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:58Z","lastTransitionTime":"2026-03-20T12:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.663117 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.663252 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:28:58 crc kubenswrapper[4817]: E0320 12:28:58.664859 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 12:28:58 crc kubenswrapper[4817]: E0320 12:28:58.665017 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.726323 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.726679 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.726949 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.727202 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.727357 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:58Z","lastTransitionTime":"2026-03-20T12:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.829901 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.829937 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.829945 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.829958 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.829967 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:58Z","lastTransitionTime":"2026-03-20T12:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.932878 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.932915 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.932923 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.932936 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:58 crc kubenswrapper[4817]: I0320 12:28:58.932946 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:58Z","lastTransitionTime":"2026-03-20T12:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.036651 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.036718 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.036735 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.036760 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.036780 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:59Z","lastTransitionTime":"2026-03-20T12:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.129664 4817 generic.go:334] "Generic (PLEG): container finished" podID="8fa4d382-e164-4f94-b782-a7dd98fa4460" containerID="089c867d8034ad9cab40db902fbbe997bfc6cacfc842c9c99cb35e7ac533341d" exitCode=0 Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.129740 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65mlb" event={"ID":"8fa4d382-e164-4f94-b782-a7dd98fa4460","Type":"ContainerDied","Data":"089c867d8034ad9cab40db902fbbe997bfc6cacfc842c9c99cb35e7ac533341d"} Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.135021 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" event={"ID":"29919e2e-77b2-4461-ba7a-24a733c3f9d1","Type":"ContainerStarted","Data":"d48596ad834a850954fa65fd3766c3ca2fc5b827065a520bd29b81e8885886a0"} Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.139525 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.139576 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.139590 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.139613 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.139632 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:59Z","lastTransitionTime":"2026-03-20T12:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.242168 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.242202 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.242213 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.242228 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.242239 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:59Z","lastTransitionTime":"2026-03-20T12:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.257636 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.257666 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.257676 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.257691 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.257703 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T12:28:59Z","lastTransitionTime":"2026-03-20T12:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.357854 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb"] Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.358237 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:28:59 crc kubenswrapper[4817]: W0320 12:28:59.359604 4817 reflector.go:561] object-"openshift-cluster-version"/"default-dockercfg-gxtc4": failed to list *v1.Secret: secrets "default-dockercfg-gxtc4" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'crc' and this object Mar 20 12:28:59 crc kubenswrapper[4817]: E0320 12:28:59.359641 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"default-dockercfg-gxtc4\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-gxtc4\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 12:28:59 crc kubenswrapper[4817]: W0320 12:28:59.359763 4817 reflector.go:561] object-"openshift-cluster-version"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'crc' and this object Mar 20 12:28:59 crc kubenswrapper[4817]: E0320 12:28:59.359808 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.362141 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.363267 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.501285 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4nzjb\" (UID: \"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.501726 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4nzjb\" (UID: \"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.501784 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs\") pod \"network-metrics-daemon-xq7wp\" (UID: \"c9cb3896-c4ff-4ccb-b494-eac8b4460342\") " pod="openshift-multus/network-metrics-daemon-xq7wp" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.501813 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4nzjb\" (UID: \"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.501842 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4nzjb\" (UID: \"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.501864 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4nzjb\" (UID: \"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:28:59 crc kubenswrapper[4817]: E0320 12:28:59.501982 4817 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 12:28:59 crc kubenswrapper[4817]: E0320 12:28:59.502023 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs podName:c9cb3896-c4ff-4ccb-b494-eac8b4460342 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:03.502010342 +0000 UTC m=+105.590323125 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs") pod "network-metrics-daemon-xq7wp" (UID: "c9cb3896-c4ff-4ccb-b494-eac8b4460342") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.602811 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4nzjb\" (UID: \"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.602893 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4nzjb\" (UID: \"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.602931 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4nzjb\" (UID: \"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.602985 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4nzjb\" (UID: \"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.603038 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4nzjb\" (UID: \"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.603190 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4nzjb\" (UID: \"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.603266 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4nzjb\" (UID: \"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.603933 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4nzjb\" (UID: \"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.626642 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4nzjb\" (UID: \"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.662564 4817 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.662670 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xq7wp" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.662729 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:28:59 crc kubenswrapper[4817]: E0320 12:28:59.662812 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xq7wp" podUID="c9cb3896-c4ff-4ccb-b494-eac8b4460342" Mar 20 12:28:59 crc kubenswrapper[4817]: E0320 12:28:59.662904 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 12:28:59 crc kubenswrapper[4817]: I0320 12:28:59.675026 4817 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 12:29:00 crc kubenswrapper[4817]: I0320 12:29:00.148576 4817 generic.go:334] "Generic (PLEG): container finished" podID="8fa4d382-e164-4f94-b782-a7dd98fa4460" containerID="0b1547ab23ceed160ec9202ba0398220cdbbdfe4314730b7fe7903e9a5d4d0f3" exitCode=0 Mar 20 12:29:00 crc kubenswrapper[4817]: I0320 12:29:00.148618 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65mlb" event={"ID":"8fa4d382-e164-4f94-b782-a7dd98fa4460","Type":"ContainerDied","Data":"0b1547ab23ceed160ec9202ba0398220cdbbdfe4314730b7fe7903e9a5d4d0f3"} Mar 20 12:29:00 crc kubenswrapper[4817]: I0320 12:29:00.415686 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:00 crc kubenswrapper[4817]: I0320 12:29:00.415846 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.415939 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:08.415869602 +0000 UTC m=+110.504182395 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:00 crc kubenswrapper[4817]: I0320 12:29:00.415984 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.416013 4817 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 12:29:00 crc kubenswrapper[4817]: I0320 12:29:00.416024 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.416076 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 12:29:08.416057537 +0000 UTC m=+110.504370360 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 12:29:00 crc kubenswrapper[4817]: I0320 12:29:00.416185 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.416239 4817 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.416362 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:08.416332605 +0000 UTC m=+110.504645418 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.416367 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.416411 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.416438 4817 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.416517 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:08.416496249 +0000 UTC m=+110.504809162 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.416608 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.416630 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.416644 4817 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.416741 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:08.416725596 +0000 UTC m=+110.505038489 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.626365 4817 projected.go:288] Couldn't get configMap openshift-cluster-version/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.626439 4817 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb: failed to sync configmap cache: timed out waiting for the condition Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.626538 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-kube-api-access podName:2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:01.126507823 +0000 UTC m=+103.214820636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-kube-api-access") pod "cluster-version-operator-5c965bbfc6-4nzjb" (UID: "2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6") : failed to sync configmap cache: timed out waiting for the condition Mar 20 12:29:00 crc kubenswrapper[4817]: I0320 12:29:00.663181 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:29:00 crc kubenswrapper[4817]: I0320 12:29:00.663206 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.663367 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 12:29:00 crc kubenswrapper[4817]: E0320 12:29:00.663499 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 12:29:00 crc kubenswrapper[4817]: I0320 12:29:00.699447 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 12:29:00 crc kubenswrapper[4817]: I0320 12:29:00.955014 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 12:29:01 crc kubenswrapper[4817]: I0320 12:29:01.158955 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65mlb" event={"ID":"8fa4d382-e164-4f94-b782-a7dd98fa4460","Type":"ContainerStarted","Data":"7ef6aee74bc867e73e98fce2fa9c2d1900403d14d323749667513206c619346f"} Mar 20 12:29:01 crc kubenswrapper[4817]: I0320 12:29:01.226361 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4nzjb\" (UID: \"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:29:01 crc kubenswrapper[4817]: I0320 12:29:01.237604 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4nzjb\" (UID: \"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:29:01 crc kubenswrapper[4817]: I0320 12:29:01.477049 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" Mar 20 12:29:01 crc kubenswrapper[4817]: I0320 12:29:01.663031 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:29:01 crc kubenswrapper[4817]: I0320 12:29:01.663093 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xq7wp" Mar 20 12:29:01 crc kubenswrapper[4817]: E0320 12:29:01.663189 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 12:29:01 crc kubenswrapper[4817]: E0320 12:29:01.663451 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xq7wp" podUID="c9cb3896-c4ff-4ccb-b494-eac8b4460342" Mar 20 12:29:01 crc kubenswrapper[4817]: I0320 12:29:01.687346 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 12:29:02 crc kubenswrapper[4817]: I0320 12:29:02.165278 4817 generic.go:334] "Generic (PLEG): container finished" podID="8fa4d382-e164-4f94-b782-a7dd98fa4460" containerID="7ef6aee74bc867e73e98fce2fa9c2d1900403d14d323749667513206c619346f" exitCode=0 Mar 20 12:29:02 crc kubenswrapper[4817]: I0320 12:29:02.165333 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65mlb" event={"ID":"8fa4d382-e164-4f94-b782-a7dd98fa4460","Type":"ContainerDied","Data":"7ef6aee74bc867e73e98fce2fa9c2d1900403d14d323749667513206c619346f"} Mar 20 12:29:02 crc kubenswrapper[4817]: I0320 12:29:02.168354 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" event={"ID":"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6","Type":"ContainerStarted","Data":"6d45a62c54d3414a90dd8e74affb92261480e26b21ccaea10cd5bbfedae6243e"} Mar 20 12:29:02 crc kubenswrapper[4817]: I0320 12:29:02.168379 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" event={"ID":"2e1bbaf1-f5f9-4a6e-a1dc-d518affbd5f6","Type":"ContainerStarted","Data":"67207c1a6a6bd8724d4390507b9fe6648e178cacc4d52a83b9555a71cddea798"} 
Mar 20 12:29:02 crc kubenswrapper[4817]: I0320 12:29:02.174638 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" event={"ID":"29919e2e-77b2-4461-ba7a-24a733c3f9d1","Type":"ContainerStarted","Data":"d7bfb0452846c368e77cc2fa010fb2473e2d8b9937697bdfed5cd3643d02382a"} Mar 20 12:29:02 crc kubenswrapper[4817]: I0320 12:29:02.209514 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.209484023 podStartE2EDuration="1.209484023s" podCreationTimestamp="2026-03-20 12:29:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:02.205214014 +0000 UTC m=+104.293526827" watchObservedRunningTime="2026-03-20 12:29:02.209484023 +0000 UTC m=+104.297796856" Mar 20 12:29:02 crc kubenswrapper[4817]: I0320 12:29:02.288571 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" podStartSLOduration=64.288553057 podStartE2EDuration="1m4.288553057s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:02.272395897 +0000 UTC m=+104.360708700" watchObservedRunningTime="2026-03-20 12:29:02.288553057 +0000 UTC m=+104.376865830" Mar 20 12:29:02 crc kubenswrapper[4817]: I0320 12:29:02.663384 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 12:29:02 crc kubenswrapper[4817]: I0320 12:29:02.663464 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:29:02 crc kubenswrapper[4817]: E0320 12:29:02.663542 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 12:29:02 crc kubenswrapper[4817]: E0320 12:29:02.663672 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 12:29:03 crc kubenswrapper[4817]: I0320 12:29:03.183887 4817 generic.go:334] "Generic (PLEG): container finished" podID="8fa4d382-e164-4f94-b782-a7dd98fa4460" containerID="a0d7811bd98ece6f92c3ba0e20f7406c59f29c2e621c0879c9fecc1b55a7dd94" exitCode=0 Mar 20 12:29:03 crc kubenswrapper[4817]: I0320 12:29:03.183974 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65mlb" event={"ID":"8fa4d382-e164-4f94-b782-a7dd98fa4460","Type":"ContainerDied","Data":"a0d7811bd98ece6f92c3ba0e20f7406c59f29c2e621c0879c9fecc1b55a7dd94"} Mar 20 12:29:03 crc kubenswrapper[4817]: I0320 12:29:03.185007 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:29:03 crc kubenswrapper[4817]: I0320 12:29:03.185051 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 
12:29:03 crc kubenswrapper[4817]: I0320 12:29:03.185072 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:29:03 crc kubenswrapper[4817]: I0320 12:29:03.220358 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:29:03 crc kubenswrapper[4817]: I0320 12:29:03.227164 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4nzjb" podStartSLOduration=66.227146898 podStartE2EDuration="1m6.227146898s" podCreationTimestamp="2026-03-20 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:02.292580819 +0000 UTC m=+104.380893612" watchObservedRunningTime="2026-03-20 12:29:03.227146898 +0000 UTC m=+105.315459681" Mar 20 12:29:03 crc kubenswrapper[4817]: I0320 12:29:03.228257 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:29:03 crc kubenswrapper[4817]: I0320 12:29:03.554360 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs\") pod \"network-metrics-daemon-xq7wp\" (UID: \"c9cb3896-c4ff-4ccb-b494-eac8b4460342\") " pod="openshift-multus/network-metrics-daemon-xq7wp" Mar 20 12:29:03 crc kubenswrapper[4817]: E0320 12:29:03.554605 4817 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 12:29:03 crc kubenswrapper[4817]: E0320 12:29:03.554712 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs podName:c9cb3896-c4ff-4ccb-b494-eac8b4460342 
nodeName:}" failed. No retries permitted until 2026-03-20 12:29:11.554690007 +0000 UTC m=+113.643002800 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs") pod "network-metrics-daemon-xq7wp" (UID: "c9cb3896-c4ff-4ccb-b494-eac8b4460342") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 12:29:03 crc kubenswrapper[4817]: I0320 12:29:03.662921 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xq7wp" Mar 20 12:29:03 crc kubenswrapper[4817]: I0320 12:29:03.662927 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:29:03 crc kubenswrapper[4817]: E0320 12:29:03.663047 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xq7wp" podUID="c9cb3896-c4ff-4ccb-b494-eac8b4460342" Mar 20 12:29:03 crc kubenswrapper[4817]: E0320 12:29:03.663151 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 12:29:03 crc kubenswrapper[4817]: I0320 12:29:03.841983 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xq7wp"] Mar 20 12:29:04 crc kubenswrapper[4817]: I0320 12:29:04.190941 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65mlb" event={"ID":"8fa4d382-e164-4f94-b782-a7dd98fa4460","Type":"ContainerStarted","Data":"f58ee08749eac11c830f334113da4a65dd29bf7db0bc8517b1fb26082f7609c5"} Mar 20 12:29:04 crc kubenswrapper[4817]: I0320 12:29:04.190973 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xq7wp" Mar 20 12:29:04 crc kubenswrapper[4817]: E0320 12:29:04.191082 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xq7wp" podUID="c9cb3896-c4ff-4ccb-b494-eac8b4460342" Mar 20 12:29:04 crc kubenswrapper[4817]: I0320 12:29:04.211819 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-65mlb" podStartSLOduration=66.211803351 podStartE2EDuration="1m6.211803351s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:04.210807013 +0000 UTC m=+106.299119816" watchObservedRunningTime="2026-03-20 12:29:04.211803351 +0000 UTC m=+106.300116124" Mar 20 12:29:04 crc kubenswrapper[4817]: I0320 12:29:04.663089 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:29:04 crc kubenswrapper[4817]: I0320 12:29:04.663141 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 12:29:04 crc kubenswrapper[4817]: E0320 12:29:04.663287 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 12:29:04 crc kubenswrapper[4817]: E0320 12:29:04.663388 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 12:29:05 crc kubenswrapper[4817]: I0320 12:29:05.662823 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xq7wp" Mar 20 12:29:05 crc kubenswrapper[4817]: I0320 12:29:05.662871 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:29:05 crc kubenswrapper[4817]: E0320 12:29:05.662958 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xq7wp" podUID="c9cb3896-c4ff-4ccb-b494-eac8b4460342" Mar 20 12:29:05 crc kubenswrapper[4817]: E0320 12:29:05.663100 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.663399 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:29:06 crc kubenswrapper[4817]: E0320 12:29:06.663530 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.663421 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 12:29:06 crc kubenswrapper[4817]: E0320 12:29:06.663622 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.803333 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.803531 4817 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.848012 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jsgjh"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.849002 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.849753 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gbltz"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.850710 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.851716 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-crt6p"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.852407 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.852943 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.853275 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r5qk5"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.853955 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r5qk5" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.857322 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.857416 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.857614 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cw976"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.859989 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-cw976" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.861170 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6lwrj"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.862147 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2jnf2"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.862172 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6lwrj" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.874111 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.876199 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.876694 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.877571 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8h755"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.878095 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-8h755" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.878449 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.878837 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h694w"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.879699 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n45lz"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.880110 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n45lz" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.880507 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h694w" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.887894 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.888552 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.888580 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.888825 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.888896 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.888944 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.889108 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.889225 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.889570 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 
12:29:06.889688 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.889783 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.890504 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.890892 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.891395 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.891591 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.891833 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.891986 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-mzrc4"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.892838 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.893154 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.893331 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.893561 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.893717 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.893847 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.901622 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.901723 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.901862 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.902012 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.902294 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.902536 4817 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.903648 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.903896 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.904019 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.904156 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.904284 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.907635 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909046 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v6wkw"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909114 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909477 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909207 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909266 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.913067 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-2jblp"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.913275 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909274 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.913499 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909312 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909311 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909337 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909359 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 
12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.913993 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9cwss"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.914250 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-2jblp" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909387 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.914453 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909421 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909462 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909522 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909529 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.904300 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909575 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909590 4817 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909683 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909694 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909741 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909780 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909826 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909914 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909923 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.909967 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.910043 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.910155 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.910152 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.910249 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.910284 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.910331 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.911518 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.911808 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.912349 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.912390 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.930189 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.912529 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.932114 4817 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cw976"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.932411 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.933362 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.936503 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.936643 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.936680 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.936820 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.937005 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.937317 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.937531 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.937844 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" 
Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.939640 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.939911 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.942836 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-crt6p"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.957199 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.957509 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.957800 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.957911 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.958450 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.958561 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.958940 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.959419 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.959540 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2jnf2"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.960179 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mn5j"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.960918 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mn5j" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.961316 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.962719 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2z24k"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.963289 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2z24k" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.966947 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.967138 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.971114 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.971369 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-g9lk4"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.971886 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq2qb"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.972266 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.972959 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.973223 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9lk4" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.973365 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq2qb" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.973685 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.973873 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.973880 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.974580 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.976334 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.976511 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bqnjh"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.976685 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.977369 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.978397 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.978735 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tcdxf"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.979331 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7vdpg"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.979796 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7vdpg" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.980011 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.982265 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.982572 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.983005 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqwpw"] Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.983755 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqwpw" Mar 20 12:29:06 crc kubenswrapper[4817]: I0320 12:29:06.985814 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:06.999279 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.000585 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002023 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b64a28cb-8f75-48c6-8980-8a1003ffba98-serving-cert\") pod \"console-operator-58897d9998-8h755\" (UID: \"b64a28cb-8f75-48c6-8980-8a1003ffba98\") " pod="openshift-console-operator/console-operator-58897d9998-8h755" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002068 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1efe6fbf-e6da-4413-ab32-6c457572e894-metrics-tls\") pod \"dns-operator-744455d44c-cw976\" (UID: \"1efe6fbf-e6da-4413-ab32-6c457572e894\") " pod="openshift-dns-operator/dns-operator-744455d44c-cw976" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002089 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb-machine-approver-tls\") pod \"machine-approver-56656f9798-bgdpq\" (UID: \"e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 
12:29:07.002130 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-client-ca\") pod \"controller-manager-879f6c89f-jsgjh\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002167 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjn6d\" (UniqueName: \"kubernetes.io/projected/da4ab27a-0064-4501-9475-4eecdd9ffbcc-kube-api-access-qjn6d\") pod \"machine-api-operator-5694c8668f-crt6p\" (UID: \"da4ab27a-0064-4501-9475-4eecdd9ffbcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002187 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9351978b-90a5-48f6-ba2b-68e2c4f2c574-trusted-ca-bundle\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002208 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002230 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002251 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5457083-eb9e-4828-839c-a7613592278e-config\") pod \"route-controller-manager-6576b87f9c-vt6cg\" (UID: \"f5457083-eb9e-4828-839c-a7613592278e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002268 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df4eb359-67fa-4b25-b909-a21b16b5be2d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-r5qk5\" (UID: \"df4eb359-67fa-4b25-b909-a21b16b5be2d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r5qk5" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002291 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-image-import-ca\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002311 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: 
\"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002330 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-audit\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002348 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002366 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cljqm\" (UniqueName: \"kubernetes.io/projected/df4eb359-67fa-4b25-b909-a21b16b5be2d-kube-api-access-cljqm\") pod \"openshift-controller-manager-operator-756b6f6bc6-r5qk5\" (UID: \"df4eb359-67fa-4b25-b909-a21b16b5be2d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r5qk5" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002385 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9414873-8b3d-4dfb-94a2-604382a77729-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hnqtm\" (UID: \"a9414873-8b3d-4dfb-94a2-604382a77729\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 
12:29:07.002407 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-node-pullsecrets\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002424 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-config\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002439 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9351978b-90a5-48f6-ba2b-68e2c4f2c574-console-oauth-config\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002458 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkcl2\" (UniqueName: \"kubernetes.io/projected/df2df950-540c-408e-a555-81b7e7da9e26-kube-api-access-fkcl2\") pod \"controller-manager-879f6c89f-jsgjh\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002487 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2713a227-3462-4d3d-86ff-6cb0101ef6be-audit-dir\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002505 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5457083-eb9e-4828-839c-a7613592278e-client-ca\") pod \"route-controller-manager-6576b87f9c-vt6cg\" (UID: \"f5457083-eb9e-4828-839c-a7613592278e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002524 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9414873-8b3d-4dfb-94a2-604382a77729-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hnqtm\" (UID: \"a9414873-8b3d-4dfb-94a2-604382a77729\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002542 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002571 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-encryption-config\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002591 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9f644c83-90e3-4eb1-80b0-82781b255d15-config\") pod \"etcd-operator-b45778765-2jnf2\" (UID: \"9f644c83-90e3-4eb1-80b0-82781b255d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002607 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9f644c83-90e3-4eb1-80b0-82781b255d15-etcd-ca\") pod \"etcd-operator-b45778765-2jnf2\" (UID: \"9f644c83-90e3-4eb1-80b0-82781b255d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002629 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9f644c83-90e3-4eb1-80b0-82781b255d15-etcd-service-ca\") pod \"etcd-operator-b45778765-2jnf2\" (UID: \"9f644c83-90e3-4eb1-80b0-82781b255d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002659 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-audit-policies\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002678 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/05c7c173-9fcc-415c-a13e-9290bc4e5735-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-n45lz\" (UID: \"05c7c173-9fcc-415c-a13e-9290bc4e5735\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n45lz" Mar 20 12:29:07 crc kubenswrapper[4817]: 
I0320 12:29:07.002697 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nzmm\" (UniqueName: \"kubernetes.io/projected/e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb-kube-api-access-2nzmm\") pod \"machine-approver-56656f9798-bgdpq\" (UID: \"e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002715 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b432f9f8-8e62-4ba9-b852-3e670ea6cdbe-serving-cert\") pod \"openshift-config-operator-7777fb866f-h694w\" (UID: \"b432f9f8-8e62-4ba9-b852-3e670ea6cdbe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h694w" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002735 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/da4ab27a-0064-4501-9475-4eecdd9ffbcc-images\") pod \"machine-api-operator-5694c8668f-crt6p\" (UID: \"da4ab27a-0064-4501-9475-4eecdd9ffbcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002753 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78vxr\" (UniqueName: \"kubernetes.io/projected/1efe6fbf-e6da-4413-ab32-6c457572e894-kube-api-access-78vxr\") pod \"dns-operator-744455d44c-cw976\" (UID: \"1efe6fbf-e6da-4413-ab32-6c457572e894\") " pod="openshift-dns-operator/dns-operator-744455d44c-cw976" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002773 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmkth\" (UniqueName: 
\"kubernetes.io/projected/b432f9f8-8e62-4ba9-b852-3e670ea6cdbe-kube-api-access-rmkth\") pod \"openshift-config-operator-7777fb866f-h694w\" (UID: \"b432f9f8-8e62-4ba9-b852-3e670ea6cdbe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h694w" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002794 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002815 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e26a2f6-3ad6-4113-8120-fc0af5b772c6-service-ca-bundle\") pod \"authentication-operator-69f744f599-9cwss\" (UID: \"4e26a2f6-3ad6-4113-8120-fc0af5b772c6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002835 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da4ab27a-0064-4501-9475-4eecdd9ffbcc-config\") pod \"machine-api-operator-5694c8668f-crt6p\" (UID: \"da4ab27a-0064-4501-9475-4eecdd9ffbcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002852 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-etcd-client\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " 
pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002873 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e26a2f6-3ad6-4113-8120-fc0af5b772c6-serving-cert\") pod \"authentication-operator-69f744f599-9cwss\" (UID: \"4e26a2f6-3ad6-4113-8120-fc0af5b772c6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002912 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2713a227-3462-4d3d-86ff-6cb0101ef6be-serving-cert\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002932 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnxcr\" (UniqueName: \"kubernetes.io/projected/2713a227-3462-4d3d-86ff-6cb0101ef6be-kube-api-access-pnxcr\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002948 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-audit-dir\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002966 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b64a28cb-8f75-48c6-8980-8a1003ffba98-trusted-ca\") pod \"console-operator-58897d9998-8h755\" (UID: \"b64a28cb-8f75-48c6-8980-8a1003ffba98\") " pod="openshift-console-operator/console-operator-58897d9998-8h755" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.002987 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jsgjh\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.003018 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.003045 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9351978b-90a5-48f6-ba2b-68e2c4f2c574-console-serving-cert\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.003068 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2493ed0-295f-4eba-8870-3f5716a76ca6-audit-dir\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.003085 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl2wv\" (UniqueName: 
\"kubernetes.io/projected/a2493ed0-295f-4eba-8870-3f5716a76ca6-kube-api-access-fl2wv\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.003109 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df4eb359-67fa-4b25-b909-a21b16b5be2d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-r5qk5\" (UID: \"df4eb359-67fa-4b25-b909-a21b16b5be2d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r5qk5" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.003671 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vlpm\" (UniqueName: \"kubernetes.io/projected/4e26a2f6-3ad6-4113-8120-fc0af5b772c6-kube-api-access-6vlpm\") pod \"authentication-operator-69f744f599-9cwss\" (UID: \"4e26a2f6-3ad6-4113-8120-fc0af5b772c6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.003703 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9414873-8b3d-4dfb-94a2-604382a77729-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hnqtm\" (UID: \"a9414873-8b3d-4dfb-94a2-604382a77729\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.003777 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvlpg\" (UniqueName: \"kubernetes.io/projected/05c7c173-9fcc-415c-a13e-9290bc4e5735-kube-api-access-fvlpg\") pod \"cluster-samples-operator-665b6dd947-n45lz\" (UID: 
\"05c7c173-9fcc-415c-a13e-9290bc4e5735\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n45lz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.003816 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f644c83-90e3-4eb1-80b0-82781b255d15-etcd-client\") pod \"etcd-operator-b45778765-2jnf2\" (UID: \"9f644c83-90e3-4eb1-80b0-82781b255d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.003838 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s66qw\" (UniqueName: \"kubernetes.io/projected/9f644c83-90e3-4eb1-80b0-82781b255d15-kube-api-access-s66qw\") pod \"etcd-operator-b45778765-2jnf2\" (UID: \"9f644c83-90e3-4eb1-80b0-82781b255d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.003855 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2713a227-3462-4d3d-86ff-6cb0101ef6be-etcd-client\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.003875 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2713a227-3462-4d3d-86ff-6cb0101ef6be-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.003920 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/da4ab27a-0064-4501-9475-4eecdd9ffbcc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-crt6p\" (UID: \"da4ab27a-0064-4501-9475-4eecdd9ffbcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.003945 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.003962 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb-config\") pod \"machine-approver-56656f9798-bgdpq\" (UID: \"e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.003991 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b64a28cb-8f75-48c6-8980-8a1003ffba98-config\") pod \"console-operator-58897d9998-8h755\" (UID: \"b64a28cb-8f75-48c6-8980-8a1003ffba98\") " pod="openshift-console-operator/console-operator-58897d9998-8h755" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004011 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2713a227-3462-4d3d-86ff-6cb0101ef6be-audit-policies\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004029 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5457083-eb9e-4828-839c-a7613592278e-serving-cert\") pod \"route-controller-manager-6576b87f9c-vt6cg\" (UID: \"f5457083-eb9e-4828-839c-a7613592278e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004049 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25t8n\" (UniqueName: \"kubernetes.io/projected/e845d329-b1ce-48a9-8088-cbb4aabe49e4-kube-api-access-25t8n\") pod \"openshift-apiserver-operator-796bbdcf4f-6lwrj\" (UID: \"e845d329-b1ce-48a9-8088-cbb4aabe49e4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6lwrj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004081 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b432f9f8-8e62-4ba9-b852-3e670ea6cdbe-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h694w\" (UID: \"b432f9f8-8e62-4ba9-b852-3e670ea6cdbe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h694w" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004102 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9351978b-90a5-48f6-ba2b-68e2c4f2c574-console-config\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004144 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e26a2f6-3ad6-4113-8120-fc0af5b772c6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9cwss\" (UID: \"4e26a2f6-3ad6-4113-8120-fc0af5b772c6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004163 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rb7w\" (UniqueName: \"kubernetes.io/projected/b64a28cb-8f75-48c6-8980-8a1003ffba98-kube-api-access-5rb7w\") pod \"console-operator-58897d9998-8h755\" (UID: \"b64a28cb-8f75-48c6-8980-8a1003ffba98\") " pod="openshift-console-operator/console-operator-58897d9998-8h755" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004182 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2713a227-3462-4d3d-86ff-6cb0101ef6be-encryption-config\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004199 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f644c83-90e3-4eb1-80b0-82781b255d15-serving-cert\") pod \"etcd-operator-b45778765-2jnf2\" (UID: \"9f644c83-90e3-4eb1-80b0-82781b255d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004216 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-serving-cert\") pod \"apiserver-76f77b778f-gbltz\" (UID: 
\"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004235 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7qhp\" (UniqueName: \"kubernetes.io/projected/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-kube-api-access-h7qhp\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004252 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df2df950-540c-408e-a555-81b7e7da9e26-serving-cert\") pod \"controller-manager-879f6c89f-jsgjh\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004519 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx7mf\" (UniqueName: \"kubernetes.io/projected/a9414873-8b3d-4dfb-94a2-604382a77729-kube-api-access-hx7mf\") pod \"cluster-image-registry-operator-dc59b4c8b-hnqtm\" (UID: \"a9414873-8b3d-4dfb-94a2-604382a77729\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004546 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb-auth-proxy-config\") pod \"machine-approver-56656f9798-bgdpq\" (UID: \"e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004573 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-config\") pod \"controller-manager-879f6c89f-jsgjh\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004610 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2713a227-3462-4d3d-86ff-6cb0101ef6be-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004647 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004665 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004673 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004697 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e26a2f6-3ad6-4113-8120-fc0af5b772c6-config\") pod \"authentication-operator-69f744f599-9cwss\" (UID: \"4e26a2f6-3ad6-4113-8120-fc0af5b772c6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004718 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9351978b-90a5-48f6-ba2b-68e2c4f2c574-service-ca\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004741 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004759 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/e845d329-b1ce-48a9-8088-cbb4aabe49e4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6lwrj\" (UID: \"e845d329-b1ce-48a9-8088-cbb4aabe49e4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6lwrj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004780 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004799 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004822 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69l2w\" (UniqueName: \"kubernetes.io/projected/f5457083-eb9e-4828-839c-a7613592278e-kube-api-access-69l2w\") pod \"route-controller-manager-6576b87f9c-vt6cg\" (UID: \"f5457083-eb9e-4828-839c-a7613592278e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004843 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9351978b-90a5-48f6-ba2b-68e2c4f2c574-oauth-serving-cert\") pod \"console-f9d7485db-mzrc4\" (UID: 
\"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004870 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k65mj\" (UniqueName: \"kubernetes.io/projected/9351978b-90a5-48f6-ba2b-68e2c4f2c574-kube-api-access-k65mj\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004904 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhbrp\" (UniqueName: \"kubernetes.io/projected/928ba203-a815-4c6d-9097-e1eafd194ab0-kube-api-access-xhbrp\") pod \"downloads-7954f5f757-2jblp\" (UID: \"928ba203-a815-4c6d-9097-e1eafd194ab0\") " pod="openshift-console/downloads-7954f5f757-2jblp" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004948 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.004977 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e845d329-b1ce-48a9-8088-cbb4aabe49e4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6lwrj\" (UID: \"e845d329-b1ce-48a9-8088-cbb4aabe49e4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6lwrj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.010666 4817 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.010821 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.012639 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.012670 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-nb552"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.013775 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.027518 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5rq7n"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.028322 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5rq7n" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.029046 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jsgjh"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.032251 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r5qk5"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.032301 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8h755"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.042495 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h694w"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.042556 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.043239 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.043338 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.043700 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.047179 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-j7kqr"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.047976 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.048172 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-j7kqr" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.048470 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.048736 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6pc55"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.049200 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6pc55" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.051765 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.051946 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.052282 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.053105 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.053605 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.053811 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.056431 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-rwqck"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.057090 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.057876 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n45lz"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.057977 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.058798 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.060180 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6lwrj"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.061734 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2jblp"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.069180 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.071658 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.077833 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.079755 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqwpw"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.081427 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.089674 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mn5j"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.091024 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9cwss"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.094566 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.097691 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.099279 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.101356 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bqnjh"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.102471 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-mzrc4"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106435 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106466 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/413aaaf0-f437-4a19-845a-6b5c1bfccd07-images\") pod \"machine-config-operator-74547568cd-hrbgf\" (UID: \"413aaaf0-f437-4a19-845a-6b5c1bfccd07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106489 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106506 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106524 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/68eb3b35-c763-4024-8fd0-6dc63ea80eb8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wq2qb\" (UID: \"68eb3b35-c763-4024-8fd0-6dc63ea80eb8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq2qb" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106543 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k65mj\" (UniqueName: \"kubernetes.io/projected/9351978b-90a5-48f6-ba2b-68e2c4f2c574-kube-api-access-k65mj\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106561 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b89aea9e-c134-42e3-b366-067d746cf7d7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2z24k\" (UID: \"b89aea9e-c134-42e3-b366-067d746cf7d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2z24k" Mar 20 12:29:07 crc 
kubenswrapper[4817]: I0320 12:29:07.106579 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106596 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e845d329-b1ce-48a9-8088-cbb4aabe49e4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6lwrj\" (UID: \"e845d329-b1ce-48a9-8088-cbb4aabe49e4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6lwrj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106614 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb-machine-approver-tls\") pod \"machine-approver-56656f9798-bgdpq\" (UID: \"e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106631 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbhln\" (UniqueName: \"kubernetes.io/projected/413aaaf0-f437-4a19-845a-6b5c1bfccd07-kube-api-access-lbhln\") pod \"machine-config-operator-74547568cd-hrbgf\" (UID: \"413aaaf0-f437-4a19-845a-6b5c1bfccd07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106649 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjn6d\" (UniqueName: 
\"kubernetes.io/projected/da4ab27a-0064-4501-9475-4eecdd9ffbcc-kube-api-access-qjn6d\") pod \"machine-api-operator-5694c8668f-crt6p\" (UID: \"da4ab27a-0064-4501-9475-4eecdd9ffbcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106667 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l6xd\" (UniqueName: \"kubernetes.io/projected/6bdeca1d-6768-4d19-a080-885b98c47f5b-kube-api-access-6l6xd\") pod \"olm-operator-6b444d44fb-zqvg6\" (UID: \"6bdeca1d-6768-4d19-a080-885b98c47f5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106689 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-secret-volume\") pod \"collect-profiles-29566815-k5qhj\" (UID: \"b6ef9afb-80f9-48dd-b41d-47874fcf3be9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106706 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106723 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-image-import-ca\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc 
kubenswrapper[4817]: I0320 12:29:07.106740 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106756 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-audit\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106773 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106789 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cljqm\" (UniqueName: \"kubernetes.io/projected/df4eb359-67fa-4b25-b909-a21b16b5be2d-kube-api-access-cljqm\") pod \"openshift-controller-manager-operator-756b6f6bc6-r5qk5\" (UID: \"df4eb359-67fa-4b25-b909-a21b16b5be2d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r5qk5" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106807 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a9414873-8b3d-4dfb-94a2-604382a77729-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hnqtm\" (UID: \"a9414873-8b3d-4dfb-94a2-604382a77729\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106831 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-node-pullsecrets\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106848 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9351978b-90a5-48f6-ba2b-68e2c4f2c574-console-oauth-config\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106865 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkcl2\" (UniqueName: \"kubernetes.io/projected/df2df950-540c-408e-a555-81b7e7da9e26-kube-api-access-fkcl2\") pod \"controller-manager-879f6c89f-jsgjh\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106885 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5457083-eb9e-4828-839c-a7613592278e-client-ca\") pod \"route-controller-manager-6576b87f9c-vt6cg\" (UID: \"f5457083-eb9e-4828-839c-a7613592278e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106903 
4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d926e43-e19a-460f-8e87-1fe72e62d352-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bqnjh\" (UID: \"4d926e43-e19a-460f-8e87-1fe72e62d352\") " pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106919 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-489bn\" (UniqueName: \"kubernetes.io/projected/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-kube-api-access-489bn\") pod \"collect-profiles-29566815-k5qhj\" (UID: \"b6ef9afb-80f9-48dd-b41d-47874fcf3be9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106935 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/413aaaf0-f437-4a19-845a-6b5c1bfccd07-proxy-tls\") pod \"machine-config-operator-74547568cd-hrbgf\" (UID: \"413aaaf0-f437-4a19-845a-6b5c1bfccd07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106954 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-encryption-config\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106976 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f644c83-90e3-4eb1-80b0-82781b255d15-config\") pod \"etcd-operator-b45778765-2jnf2\" (UID: 
\"9f644c83-90e3-4eb1-80b0-82781b255d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.106997 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9f644c83-90e3-4eb1-80b0-82781b255d15-etcd-ca\") pod \"etcd-operator-b45778765-2jnf2\" (UID: \"9f644c83-90e3-4eb1-80b0-82781b255d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107012 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9f644c83-90e3-4eb1-80b0-82781b255d15-etcd-service-ca\") pod \"etcd-operator-b45778765-2jnf2\" (UID: \"9f644c83-90e3-4eb1-80b0-82781b255d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107028 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-audit-policies\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107057 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nzmm\" (UniqueName: \"kubernetes.io/projected/e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb-kube-api-access-2nzmm\") pod \"machine-approver-56656f9798-bgdpq\" (UID: \"e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107074 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/87a729e2-4d7e-4f68-bef5-ef25d4ab5b43-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-g9lk4\" (UID: \"87a729e2-4d7e-4f68-bef5-ef25d4ab5b43\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g9lk4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107091 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/05c7c173-9fcc-415c-a13e-9290bc4e5735-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-n45lz\" (UID: \"05c7c173-9fcc-415c-a13e-9290bc4e5735\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n45lz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107106 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/da4ab27a-0064-4501-9475-4eecdd9ffbcc-images\") pod \"machine-api-operator-5694c8668f-crt6p\" (UID: \"da4ab27a-0064-4501-9475-4eecdd9ffbcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107138 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78vxr\" (UniqueName: \"kubernetes.io/projected/1efe6fbf-e6da-4413-ab32-6c457572e894-kube-api-access-78vxr\") pod \"dns-operator-744455d44c-cw976\" (UID: \"1efe6fbf-e6da-4413-ab32-6c457572e894\") " pod="openshift-dns-operator/dns-operator-744455d44c-cw976" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107156 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9148329e-b499-499d-a1a6-b1ff1368de6c-service-ca-bundle\") pod \"router-default-5444994796-nb552\" (UID: \"9148329e-b499-499d-a1a6-b1ff1368de6c\") " pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 
12:29:07.107192 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0f2d38b-934b-4185-97ab-f43bfcbad479-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qdnmj\" (UID: \"d0f2d38b-934b-4185-97ab-f43bfcbad479\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107214 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107231 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e26a2f6-3ad6-4113-8120-fc0af5b772c6-service-ca-bundle\") pod \"authentication-operator-69f744f599-9cwss\" (UID: \"4e26a2f6-3ad6-4113-8120-fc0af5b772c6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107255 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmkth\" (UniqueName: \"kubernetes.io/projected/b432f9f8-8e62-4ba9-b852-3e670ea6cdbe-kube-api-access-rmkth\") pod \"openshift-config-operator-7777fb866f-h694w\" (UID: \"b432f9f8-8e62-4ba9-b852-3e670ea6cdbe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h694w" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107272 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/da4ab27a-0064-4501-9475-4eecdd9ffbcc-config\") pod \"machine-api-operator-5694c8668f-crt6p\" (UID: \"da4ab27a-0064-4501-9475-4eecdd9ffbcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107288 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-etcd-client\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107323 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2713a227-3462-4d3d-86ff-6cb0101ef6be-serving-cert\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107340 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-audit-dir\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107357 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdddv\" (UniqueName: \"kubernetes.io/projected/17ba5d71-5532-4d72-9505-9bab643bbd40-kube-api-access-rdddv\") pod \"service-ca-operator-777779d784-tzxvg\" (UID: \"17ba5d71-5532-4d72-9505-9bab643bbd40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107374 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98722637-bcd0-4008-ad86-2a5a7e129b34-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8mn5j\" (UID: \"98722637-bcd0-4008-ad86-2a5a7e129b34\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mn5j" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107390 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98722637-bcd0-4008-ad86-2a5a7e129b34-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8mn5j\" (UID: \"98722637-bcd0-4008-ad86-2a5a7e129b34\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mn5j" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107405 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmz8h\" (UniqueName: \"kubernetes.io/projected/3942ee53-6987-42c9-85ed-e5b799a1555d-kube-api-access-cmz8h\") pod \"ingress-operator-5b745b69d9-prq7r\" (UID: \"3942ee53-6987-42c9-85ed-e5b799a1555d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107423 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl2wv\" (UniqueName: \"kubernetes.io/projected/a2493ed0-295f-4eba-8870-3f5716a76ca6-kube-api-access-fl2wv\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107439 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df4eb359-67fa-4b25-b909-a21b16b5be2d-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-r5qk5\" (UID: \"df4eb359-67fa-4b25-b909-a21b16b5be2d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r5qk5" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107455 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1907dc62-9c82-492c-a7b9-642410848e1f-apiservice-cert\") pod \"packageserver-d55dfcdfc-cl2qb\" (UID: \"1907dc62-9c82-492c-a7b9-642410848e1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107471 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b89aea9e-c134-42e3-b366-067d746cf7d7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2z24k\" (UID: \"b89aea9e-c134-42e3-b366-067d746cf7d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2z24k" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107485 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/413aaaf0-f437-4a19-845a-6b5c1bfccd07-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hrbgf\" (UID: \"413aaaf0-f437-4a19-845a-6b5c1bfccd07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107501 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvlpg\" (UniqueName: \"kubernetes.io/projected/05c7c173-9fcc-415c-a13e-9290bc4e5735-kube-api-access-fvlpg\") pod \"cluster-samples-operator-665b6dd947-n45lz\" (UID: \"05c7c173-9fcc-415c-a13e-9290bc4e5735\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n45lz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107524 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f644c83-90e3-4eb1-80b0-82781b255d15-etcd-client\") pod \"etcd-operator-b45778765-2jnf2\" (UID: \"9f644c83-90e3-4eb1-80b0-82781b255d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107540 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s66qw\" (UniqueName: \"kubernetes.io/projected/9f644c83-90e3-4eb1-80b0-82781b255d15-kube-api-access-s66qw\") pod \"etcd-operator-b45778765-2jnf2\" (UID: \"9f644c83-90e3-4eb1-80b0-82781b255d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107554 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2713a227-3462-4d3d-86ff-6cb0101ef6be-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107575 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d152b635-3488-428c-b8bd-b28e7fe13bef-config\") pod \"kube-apiserver-operator-766d6c64bb-jqwpw\" (UID: \"d152b635-3488-428c-b8bd-b28e7fe13bef\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqwpw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107590 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/479d3864-1c15-4044-84b2-376cc2b603b6-profile-collector-cert\") pod \"catalog-operator-68c6474976-tw2rl\" (UID: \"479d3864-1c15-4044-84b2-376cc2b603b6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107607 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkscb\" (UniqueName: \"kubernetes.io/projected/68eb3b35-c763-4024-8fd0-6dc63ea80eb8-kube-api-access-wkscb\") pod \"control-plane-machine-set-operator-78cbb6b69f-wq2qb\" (UID: \"68eb3b35-c763-4024-8fd0-6dc63ea80eb8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq2qb" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107623 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5457083-eb9e-4828-839c-a7613592278e-serving-cert\") pod \"route-controller-manager-6576b87f9c-vt6cg\" (UID: \"f5457083-eb9e-4828-839c-a7613592278e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107639 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25t8n\" (UniqueName: \"kubernetes.io/projected/e845d329-b1ce-48a9-8088-cbb4aabe49e4-kube-api-access-25t8n\") pod \"openshift-apiserver-operator-796bbdcf4f-6lwrj\" (UID: \"e845d329-b1ce-48a9-8088-cbb4aabe49e4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6lwrj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107654 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b432f9f8-8e62-4ba9-b852-3e670ea6cdbe-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h694w\" (UID: \"b432f9f8-8e62-4ba9-b852-3e670ea6cdbe\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-h694w" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107672 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv6tm\" (UniqueName: \"kubernetes.io/projected/85d5ecf6-da8d-4953-9b78-7ba019986d37-kube-api-access-fv6tm\") pod \"migrator-59844c95c7-7vdpg\" (UID: \"85d5ecf6-da8d-4953-9b78-7ba019986d37\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7vdpg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107687 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b5314572-4537-46f6-a0f3-08112dd1d556-node-bootstrap-token\") pod \"machine-config-server-j7kqr\" (UID: \"b5314572-4537-46f6-a0f3-08112dd1d556\") " pod="openshift-machine-config-operator/machine-config-server-j7kqr" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107702 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9148329e-b499-499d-a1a6-b1ff1368de6c-metrics-certs\") pod \"router-default-5444994796-nb552\" (UID: \"9148329e-b499-499d-a1a6-b1ff1368de6c\") " pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107718 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2713a227-3462-4d3d-86ff-6cb0101ef6be-audit-policies\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107742 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/9351978b-90a5-48f6-ba2b-68e2c4f2c574-console-config\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107758 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4d926e43-e19a-460f-8e87-1fe72e62d352-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bqnjh\" (UID: \"4d926e43-e19a-460f-8e87-1fe72e62d352\") " pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107774 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17ba5d71-5532-4d72-9505-9bab643bbd40-serving-cert\") pod \"service-ca-operator-777779d784-tzxvg\" (UID: \"17ba5d71-5532-4d72-9505-9bab643bbd40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107790 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1907dc62-9c82-492c-a7b9-642410848e1f-webhook-cert\") pod \"packageserver-d55dfcdfc-cl2qb\" (UID: \"1907dc62-9c82-492c-a7b9-642410848e1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107804 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3942ee53-6987-42c9-85ed-e5b799a1555d-metrics-tls\") pod \"ingress-operator-5b745b69d9-prq7r\" (UID: \"3942ee53-6987-42c9-85ed-e5b799a1555d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" Mar 20 12:29:07 crc 
kubenswrapper[4817]: I0320 12:29:07.107820 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e26a2f6-3ad6-4113-8120-fc0af5b772c6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9cwss\" (UID: \"4e26a2f6-3ad6-4113-8120-fc0af5b772c6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107836 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2713a227-3462-4d3d-86ff-6cb0101ef6be-encryption-config\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107851 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7qhp\" (UniqueName: \"kubernetes.io/projected/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-kube-api-access-h7qhp\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107867 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df2df950-540c-408e-a555-81b7e7da9e26-serving-cert\") pod \"controller-manager-879f6c89f-jsgjh\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107885 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pgzm\" (UniqueName: \"kubernetes.io/projected/344f693a-912d-41fe-a9f8-c344e2770d08-kube-api-access-7pgzm\") pod \"package-server-manager-789f6589d5-v8vzq\" (UID: 
\"344f693a-912d-41fe-a9f8-c344e2770d08\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107902 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx7mf\" (UniqueName: \"kubernetes.io/projected/a9414873-8b3d-4dfb-94a2-604382a77729-kube-api-access-hx7mf\") pod \"cluster-image-registry-operator-dc59b4c8b-hnqtm\" (UID: \"a9414873-8b3d-4dfb-94a2-604382a77729\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107917 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb-auth-proxy-config\") pod \"machine-approver-56656f9798-bgdpq\" (UID: \"e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107934 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2mhl\" (UniqueName: \"kubernetes.io/projected/1907dc62-9c82-492c-a7b9-642410848e1f-kube-api-access-h2mhl\") pod \"packageserver-d55dfcdfc-cl2qb\" (UID: \"1907dc62-9c82-492c-a7b9-642410848e1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107950 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v896\" (UniqueName: \"kubernetes.io/projected/87a729e2-4d7e-4f68-bef5-ef25d4ab5b43-kube-api-access-8v896\") pod \"multus-admission-controller-857f4d67dd-g9lk4\" (UID: \"87a729e2-4d7e-4f68-bef5-ef25d4ab5b43\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g9lk4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107966 4817 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a46bd9-a747-4587-9333-f10f27e9be52-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5rq7n\" (UID: \"00a46bd9-a747-4587-9333-f10f27e9be52\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5rq7n" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107981 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00a46bd9-a747-4587-9333-f10f27e9be52-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5rq7n\" (UID: \"00a46bd9-a747-4587-9333-f10f27e9be52\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5rq7n" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.107998 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-config-volume\") pod \"collect-profiles-29566815-k5qhj\" (UID: \"b6ef9afb-80f9-48dd-b41d-47874fcf3be9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.108014 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6bdeca1d-6768-4d19-a080-885b98c47f5b-srv-cert\") pod \"olm-operator-6b444d44fb-zqvg6\" (UID: \"6bdeca1d-6768-4d19-a080-885b98c47f5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.108031 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.108048 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e26a2f6-3ad6-4113-8120-fc0af5b772c6-config\") pod \"authentication-operator-69f744f599-9cwss\" (UID: \"4e26a2f6-3ad6-4113-8120-fc0af5b772c6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.108067 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9148329e-b499-499d-a1a6-b1ff1368de6c-stats-auth\") pod \"router-default-5444994796-nb552\" (UID: \"9148329e-b499-499d-a1a6-b1ff1368de6c\") " pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.108088 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9351978b-90a5-48f6-ba2b-68e2c4f2c574-service-ca\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.108106 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69l2w\" (UniqueName: \"kubernetes.io/projected/f5457083-eb9e-4828-839c-a7613592278e-kube-api-access-69l2w\") pod \"route-controller-manager-6576b87f9c-vt6cg\" (UID: \"f5457083-eb9e-4828-839c-a7613592278e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.108138 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e845d329-b1ce-48a9-8088-cbb4aabe49e4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6lwrj\" (UID: \"e845d329-b1ce-48a9-8088-cbb4aabe49e4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6lwrj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.108155 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3942ee53-6987-42c9-85ed-e5b799a1555d-trusted-ca\") pod \"ingress-operator-5b745b69d9-prq7r\" (UID: \"3942ee53-6987-42c9-85ed-e5b799a1555d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.108170 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/479d3864-1c15-4044-84b2-376cc2b603b6-srv-cert\") pod \"catalog-operator-68c6474976-tw2rl\" (UID: \"479d3864-1c15-4044-84b2-376cc2b603b6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.108186 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.108201 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9351978b-90a5-48f6-ba2b-68e2c4f2c574-oauth-serving-cert\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " 
pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.108217 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6bdeca1d-6768-4d19-a080-885b98c47f5b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zqvg6\" (UID: \"6bdeca1d-6768-4d19-a080-885b98c47f5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.108234 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhbrp\" (UniqueName: \"kubernetes.io/projected/928ba203-a815-4c6d-9097-e1eafd194ab0-kube-api-access-xhbrp\") pod \"downloads-7954f5f757-2jblp\" (UID: \"928ba203-a815-4c6d-9097-e1eafd194ab0\") " pod="openshift-console/downloads-7954f5f757-2jblp" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.108240 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-audit-dir\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.109035 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df4eb359-67fa-4b25-b909-a21b16b5be2d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-r5qk5\" (UID: \"df4eb359-67fa-4b25-b909-a21b16b5be2d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r5qk5" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.109206 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f644c83-90e3-4eb1-80b0-82781b255d15-config\") pod \"etcd-operator-b45778765-2jnf2\" 
(UID: \"9f644c83-90e3-4eb1-80b0-82781b255d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.109709 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9f644c83-90e3-4eb1-80b0-82781b255d15-etcd-ca\") pod \"etcd-operator-b45778765-2jnf2\" (UID: \"9f644c83-90e3-4eb1-80b0-82781b255d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.110562 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2713a227-3462-4d3d-86ff-6cb0101ef6be-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.114305 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7vdpg"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.114354 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-g9lk4"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.116943 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-audit-policies\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.117607 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v6wkw"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.118084 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b432f9f8-8e62-4ba9-b852-3e670ea6cdbe-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h694w\" (UID: \"b432f9f8-8e62-4ba9-b852-3e670ea6cdbe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h694w" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.118418 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2713a227-3462-4d3d-86ff-6cb0101ef6be-audit-policies\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.118852 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.119059 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.119098 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc 
kubenswrapper[4817]: I0320 12:29:07.119388 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9351978b-90a5-48f6-ba2b-68e2c4f2c574-console-config\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.119435 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.108250 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1efe6fbf-e6da-4413-ab32-6c457572e894-metrics-tls\") pod \"dns-operator-744455d44c-cw976\" (UID: \"1efe6fbf-e6da-4413-ab32-6c457572e894\") " pod="openshift-dns-operator/dns-operator-744455d44c-cw976" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.119963 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da4ab27a-0064-4501-9475-4eecdd9ffbcc-config\") pod \"machine-api-operator-5694c8668f-crt6p\" (UID: \"da4ab27a-0064-4501-9475-4eecdd9ffbcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.120361 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9351978b-90a5-48f6-ba2b-68e2c4f2c574-service-ca\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.120367 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-node-pullsecrets\") pod \"apiserver-76f77b778f-gbltz\" 
(UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.121026 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-audit\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.121442 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.121439 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/da4ab27a-0064-4501-9475-4eecdd9ffbcc-images\") pod \"machine-api-operator-5694c8668f-crt6p\" (UID: \"da4ab27a-0064-4501-9475-4eecdd9ffbcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.121591 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb-auth-proxy-config\") pod \"machine-approver-56656f9798-bgdpq\" (UID: \"e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.121658 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b64a28cb-8f75-48c6-8980-8a1003ffba98-serving-cert\") pod 
\"console-operator-58897d9998-8h755\" (UID: \"b64a28cb-8f75-48c6-8980-8a1003ffba98\") " pod="openshift-console-operator/console-operator-58897d9998-8h755" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.127295 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.122938 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9f644c83-90e3-4eb1-80b0-82781b255d15-etcd-service-ca\") pod \"etcd-operator-b45778765-2jnf2\" (UID: \"9f644c83-90e3-4eb1-80b0-82781b255d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.126293 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df2df950-540c-408e-a555-81b7e7da9e26-serving-cert\") pod \"controller-manager-879f6c89f-jsgjh\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.126959 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.126975 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b64a28cb-8f75-48c6-8980-8a1003ffba98-serving-cert\") pod \"console-operator-58897d9998-8h755\" (UID: \"b64a28cb-8f75-48c6-8980-8a1003ffba98\") " pod="openshift-console-operator/console-operator-58897d9998-8h755" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.127411 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-client-ca\") pod \"controller-manager-879f6c89f-jsgjh\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.127747 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9351978b-90a5-48f6-ba2b-68e2c4f2c574-trusted-ca-bundle\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.122459 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e845d329-b1ce-48a9-8088-cbb4aabe49e4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6lwrj\" (UID: \"e845d329-b1ce-48a9-8088-cbb4aabe49e4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6lwrj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.127713 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e845d329-b1ce-48a9-8088-cbb4aabe49e4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6lwrj\" (UID: \"e845d329-b1ce-48a9-8088-cbb4aabe49e4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6lwrj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.128238 4817 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5457083-eb9e-4828-839c-a7613592278e-client-ca\") pod \"route-controller-manager-6576b87f9c-vt6cg\" (UID: \"f5457083-eb9e-4828-839c-a7613592278e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.128433 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-image-import-ca\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.128950 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9351978b-90a5-48f6-ba2b-68e2c4f2c574-oauth-serving-cert\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.129027 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.129067 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5457083-eb9e-4828-839c-a7613592278e-config\") pod \"route-controller-manager-6576b87f9c-vt6cg\" (UID: \"f5457083-eb9e-4828-839c-a7613592278e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" Mar 20 12:29:07 crc 
kubenswrapper[4817]: I0320 12:29:07.131750 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9351978b-90a5-48f6-ba2b-68e2c4f2c574-trusted-ca-bundle\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.131811 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-client-ca\") pod \"controller-manager-879f6c89f-jsgjh\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.131838 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/05c7c173-9fcc-415c-a13e-9290bc4e5735-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-n45lz\" (UID: \"05c7c173-9fcc-415c-a13e-9290bc4e5735\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n45lz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.132050 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.132239 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f644c83-90e3-4eb1-80b0-82781b255d15-etcd-client\") pod \"etcd-operator-b45778765-2jnf2\" (UID: \"9f644c83-90e3-4eb1-80b0-82781b255d15\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.132310 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.132662 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5457083-eb9e-4828-839c-a7613592278e-serving-cert\") pod \"route-controller-manager-6576b87f9c-vt6cg\" (UID: \"f5457083-eb9e-4828-839c-a7613592278e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.133490 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2713a227-3462-4d3d-86ff-6cb0101ef6be-serving-cert\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.133910 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9351978b-90a5-48f6-ba2b-68e2c4f2c574-console-oauth-config\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.134049 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1efe6fbf-e6da-4413-ab32-6c457572e894-metrics-tls\") pod \"dns-operator-744455d44c-cw976\" (UID: \"1efe6fbf-e6da-4413-ab32-6c457572e894\") " pod="openshift-dns-operator/dns-operator-744455d44c-cw976" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.134145 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver/apiserver-76f77b778f-gbltz"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.134592 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.129160 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df4eb359-67fa-4b25-b909-a21b16b5be2d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-r5qk5\" (UID: \"df4eb359-67fa-4b25-b909-a21b16b5be2d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r5qk5" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.134966 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0f2d38b-934b-4185-97ab-f43bfcbad479-proxy-tls\") pod \"machine-config-controller-84d6567774-qdnmj\" (UID: \"d0f2d38b-934b-4185-97ab-f43bfcbad479\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.135337 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d152b635-3488-428c-b8bd-b28e7fe13bef-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jqwpw\" (UID: \"d152b635-3488-428c-b8bd-b28e7fe13bef\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqwpw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.135348 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9414873-8b3d-4dfb-94a2-604382a77729-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hnqtm\" (UID: \"a9414873-8b3d-4dfb-94a2-604382a77729\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.135643 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.135962 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-config\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.136358 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2713a227-3462-4d3d-86ff-6cb0101ef6be-audit-dir\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.136398 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9414873-8b3d-4dfb-94a2-604382a77729-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hnqtm\" (UID: \"a9414873-8b3d-4dfb-94a2-604382a77729\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.136664 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/17ba5d71-5532-4d72-9505-9bab643bbd40-config\") pod \"service-ca-operator-777779d784-tzxvg\" (UID: \"17ba5d71-5532-4d72-9505-9bab643bbd40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.136802 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.137110 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/344f693a-912d-41fe-a9f8-c344e2770d08-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-v8vzq\" (UID: \"344f693a-912d-41fe-a9f8-c344e2770d08\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.137288 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swp4s\" (UniqueName: \"kubernetes.io/projected/479d3864-1c15-4044-84b2-376cc2b603b6-kube-api-access-swp4s\") pod \"catalog-operator-68c6474976-tw2rl\" (UID: \"479d3864-1c15-4044-84b2-376cc2b603b6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.137413 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b432f9f8-8e62-4ba9-b852-3e670ea6cdbe-serving-cert\") pod \"openshift-config-operator-7777fb866f-h694w\" (UID: \"b432f9f8-8e62-4ba9-b852-3e670ea6cdbe\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-h694w" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.137388 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-config\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.137562 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2713a227-3462-4d3d-86ff-6cb0101ef6be-audit-dir\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.137519 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5457083-eb9e-4828-839c-a7613592278e-config\") pod \"route-controller-manager-6576b87f9c-vt6cg\" (UID: \"f5457083-eb9e-4828-839c-a7613592278e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.137773 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a46bd9-a747-4587-9333-f10f27e9be52-config\") pod \"kube-controller-manager-operator-78b949d7b-5rq7n\" (UID: \"00a46bd9-a747-4587-9333-f10f27e9be52\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5rq7n" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.137819 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b89aea9e-c134-42e3-b366-067d746cf7d7-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-2z24k\" (UID: \"b89aea9e-c134-42e3-b366-067d746cf7d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2z24k" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.138107 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e26a2f6-3ad6-4113-8120-fc0af5b772c6-serving-cert\") pod \"authentication-operator-69f744f599-9cwss\" (UID: \"4e26a2f6-3ad6-4113-8120-fc0af5b772c6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.138271 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr49l\" (UniqueName: \"kubernetes.io/projected/b5314572-4537-46f6-a0f3-08112dd1d556-kube-api-access-zr49l\") pod \"machine-config-server-j7kqr\" (UID: \"b5314572-4537-46f6-a0f3-08112dd1d556\") " pod="openshift-machine-config-operator/machine-config-server-j7kqr" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.138471 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b64a28cb-8f75-48c6-8980-8a1003ffba98-trusted-ca\") pod \"console-operator-58897d9998-8h755\" (UID: \"b64a28cb-8f75-48c6-8980-8a1003ffba98\") " pod="openshift-console-operator/console-operator-58897d9998-8h755" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.138579 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jsgjh\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.138630 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pnxcr\" (UniqueName: \"kubernetes.io/projected/2713a227-3462-4d3d-86ff-6cb0101ef6be-kube-api-access-pnxcr\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.138756 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckzg6\" (UniqueName: \"kubernetes.io/projected/9148329e-b499-499d-a1a6-b1ff1368de6c-kube-api-access-ckzg6\") pod \"router-default-5444994796-nb552\" (UID: \"9148329e-b499-499d-a1a6-b1ff1368de6c\") " pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.139150 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9351978b-90a5-48f6-ba2b-68e2c4f2c574-console-serving-cert\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.139201 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2493ed0-295f-4eba-8870-3f5716a76ca6-audit-dir\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.139367 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 
12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.139381 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b5314572-4537-46f6-a0f3-08112dd1d556-certs\") pod \"machine-config-server-j7kqr\" (UID: \"b5314572-4537-46f6-a0f3-08112dd1d556\") " pod="openshift-machine-config-operator/machine-config-server-j7kqr" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.139555 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vlpm\" (UniqueName: \"kubernetes.io/projected/4e26a2f6-3ad6-4113-8120-fc0af5b772c6-kube-api-access-6vlpm\") pod \"authentication-operator-69f744f599-9cwss\" (UID: \"4e26a2f6-3ad6-4113-8120-fc0af5b772c6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.139685 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df4eb359-67fa-4b25-b909-a21b16b5be2d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-r5qk5\" (UID: \"df4eb359-67fa-4b25-b909-a21b16b5be2d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r5qk5" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.139711 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2493ed0-295f-4eba-8870-3f5716a76ca6-audit-dir\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.140185 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.140634 4817 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b64a28cb-8f75-48c6-8980-8a1003ffba98-trusted-ca\") pod \"console-operator-58897d9998-8h755\" (UID: \"b64a28cb-8f75-48c6-8980-8a1003ffba98\") " pod="openshift-console-operator/console-operator-58897d9998-8h755" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.140927 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.142006 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.142057 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b432f9f8-8e62-4ba9-b852-3e670ea6cdbe-serving-cert\") pod \"openshift-config-operator-7777fb866f-h694w\" (UID: \"b432f9f8-8e62-4ba9-b852-3e670ea6cdbe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h694w" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.142287 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9414873-8b3d-4dfb-94a2-604382a77729-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hnqtm\" (UID: \"a9414873-8b3d-4dfb-94a2-604382a77729\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.142663 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/da4ab27a-0064-4501-9475-4eecdd9ffbcc-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-crt6p\" (UID: \"da4ab27a-0064-4501-9475-4eecdd9ffbcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.142842 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2713a227-3462-4d3d-86ff-6cb0101ef6be-etcd-client\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.142906 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3942ee53-6987-42c9-85ed-e5b799a1555d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-prq7r\" (UID: \"3942ee53-6987-42c9-85ed-e5b799a1555d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.143085 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.143184 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb-config\") pod \"machine-approver-56656f9798-bgdpq\" (UID: \"e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.143300 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1907dc62-9c82-492c-a7b9-642410848e1f-tmpfs\") pod \"packageserver-d55dfcdfc-cl2qb\" (UID: \"1907dc62-9c82-492c-a7b9-642410848e1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.144105 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.144141 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.145477 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zb66q"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.146192 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b64a28cb-8f75-48c6-8980-8a1003ffba98-config\") pod \"console-operator-58897d9998-8h755\" (UID: \"b64a28cb-8f75-48c6-8980-8a1003ffba98\") " pod="openshift-console-operator/console-operator-58897d9998-8h755" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.146305 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d152b635-3488-428c-b8bd-b28e7fe13bef-kube-api-access\") pod 
\"kube-apiserver-operator-766d6c64bb-jqwpw\" (UID: \"d152b635-3488-428c-b8bd-b28e7fe13bef\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqwpw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.146362 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vzds\" (UniqueName: \"kubernetes.io/projected/98722637-bcd0-4008-ad86-2a5a7e129b34-kube-api-access-7vzds\") pod \"kube-storage-version-migrator-operator-b67b599dd-8mn5j\" (UID: \"98722637-bcd0-4008-ad86-2a5a7e129b34\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mn5j" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.146823 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rb7w\" (UniqueName: \"kubernetes.io/projected/b64a28cb-8f75-48c6-8980-8a1003ffba98-kube-api-access-5rb7w\") pod \"console-operator-58897d9998-8h755\" (UID: \"b64a28cb-8f75-48c6-8980-8a1003ffba98\") " pod="openshift-console-operator/console-operator-58897d9998-8h755" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.146948 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9148329e-b499-499d-a1a6-b1ff1368de6c-default-certificate\") pod \"router-default-5444994796-nb552\" (UID: \"9148329e-b499-499d-a1a6-b1ff1368de6c\") " pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.147066 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jsgjh\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:07 crc 
kubenswrapper[4817]: I0320 12:29:07.147238 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b64a28cb-8f75-48c6-8980-8a1003ffba98-config\") pod \"console-operator-58897d9998-8h755\" (UID: \"b64a28cb-8f75-48c6-8980-8a1003ffba98\") " pod="openshift-console-operator/console-operator-58897d9998-8h755" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.147201 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz5t6\" (UniqueName: \"kubernetes.io/projected/d0f2d38b-934b-4185-97ab-f43bfcbad479-kube-api-access-dz5t6\") pod \"machine-config-controller-84d6567774-qdnmj\" (UID: \"d0f2d38b-934b-4185-97ab-f43bfcbad479\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.147682 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f644c83-90e3-4eb1-80b0-82781b255d15-serving-cert\") pod \"etcd-operator-b45778765-2jnf2\" (UID: \"9f644c83-90e3-4eb1-80b0-82781b255d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.147828 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc678\" (UniqueName: \"kubernetes.io/projected/4d926e43-e19a-460f-8e87-1fe72e62d352-kube-api-access-bc678\") pod \"marketplace-operator-79b997595-bqnjh\" (UID: \"4d926e43-e19a-460f-8e87-1fe72e62d352\") " pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.147889 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-serving-cert\") pod \"apiserver-76f77b778f-gbltz\" (UID: 
\"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.147918 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9414873-8b3d-4dfb-94a2-604382a77729-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hnqtm\" (UID: \"a9414873-8b3d-4dfb-94a2-604382a77729\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.147952 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2713a227-3462-4d3d-86ff-6cb0101ef6be-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.147981 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-config\") pod \"controller-manager-879f6c89f-jsgjh\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.148008 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zb66q" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.148875 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2713a227-3462-4d3d-86ff-6cb0101ef6be-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.150880 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.151251 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2713a227-3462-4d3d-86ff-6cb0101ef6be-encryption-config\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.151540 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-etcd-client\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.152312 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-encryption-config\") pod \"apiserver-76f77b778f-gbltz\" (UID: 
\"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.152500 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/da4ab27a-0064-4501-9475-4eecdd9ffbcc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-crt6p\" (UID: \"da4ab27a-0064-4501-9475-4eecdd9ffbcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.153163 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.153443 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-55j94"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.153483 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-config\") pod \"controller-manager-879f6c89f-jsgjh\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.153542 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2713a227-3462-4d3d-86ff-6cb0101ef6be-etcd-client\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.153841 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-serving-cert\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " 
pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.153952 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9351978b-90a5-48f6-ba2b-68e2c4f2c574-console-serving-cert\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.155937 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb-machine-approver-tls\") pod \"machine-approver-56656f9798-bgdpq\" (UID: \"e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.156744 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.156858 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-55j94" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.157540 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.158634 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5rq7n"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.158837 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f644c83-90e3-4eb1-80b0-82781b255d15-serving-cert\") pod \"etcd-operator-b45778765-2jnf2\" (UID: \"9f644c83-90e3-4eb1-80b0-82781b255d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.159517 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e26a2f6-3ad6-4113-8120-fc0af5b772c6-service-ca-bundle\") pod \"authentication-operator-69f744f599-9cwss\" (UID: \"4e26a2f6-3ad6-4113-8120-fc0af5b772c6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.159649 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.160037 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e26a2f6-3ad6-4113-8120-fc0af5b772c6-serving-cert\") pod \"authentication-operator-69f744f599-9cwss\" (UID: \"4e26a2f6-3ad6-4113-8120-fc0af5b772c6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.160671 
4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2z24k"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.161910 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tcdxf"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.162934 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.164064 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq2qb"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.165099 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6pc55"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.166186 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zb66q"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.167632 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-55j94"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.169561 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb-config\") pod \"machine-approver-56656f9798-bgdpq\" (UID: \"e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.170572 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-76blx"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.171827 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-76blx" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.173104 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-76blx"] Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.176816 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.182901 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e26a2f6-3ad6-4113-8120-fc0af5b772c6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9cwss\" (UID: \"4e26a2f6-3ad6-4113-8120-fc0af5b772c6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.191814 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.200392 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e26a2f6-3ad6-4113-8120-fc0af5b772c6-config\") pod \"authentication-operator-69f744f599-9cwss\" (UID: \"4e26a2f6-3ad6-4113-8120-fc0af5b772c6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.212172 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.248707 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17ba5d71-5532-4d72-9505-9bab643bbd40-config\") pod \"service-ca-operator-777779d784-tzxvg\" (UID: \"17ba5d71-5532-4d72-9505-9bab643bbd40\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.248757 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swp4s\" (UniqueName: \"kubernetes.io/projected/479d3864-1c15-4044-84b2-376cc2b603b6-kube-api-access-swp4s\") pod \"catalog-operator-68c6474976-tw2rl\" (UID: \"479d3864-1c15-4044-84b2-376cc2b603b6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.248790 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/344f693a-912d-41fe-a9f8-c344e2770d08-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-v8vzq\" (UID: \"344f693a-912d-41fe-a9f8-c344e2770d08\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.248818 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a46bd9-a747-4587-9333-f10f27e9be52-config\") pod \"kube-controller-manager-operator-78b949d7b-5rq7n\" (UID: \"00a46bd9-a747-4587-9333-f10f27e9be52\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5rq7n" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.248841 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b89aea9e-c134-42e3-b366-067d746cf7d7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2z24k\" (UID: \"b89aea9e-c134-42e3-b366-067d746cf7d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2z24k" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.248868 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zr49l\" (UniqueName: \"kubernetes.io/projected/b5314572-4537-46f6-a0f3-08112dd1d556-kube-api-access-zr49l\") pod \"machine-config-server-j7kqr\" (UID: \"b5314572-4537-46f6-a0f3-08112dd1d556\") " pod="openshift-machine-config-operator/machine-config-server-j7kqr" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.248900 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckzg6\" (UniqueName: \"kubernetes.io/projected/9148329e-b499-499d-a1a6-b1ff1368de6c-kube-api-access-ckzg6\") pod \"router-default-5444994796-nb552\" (UID: \"9148329e-b499-499d-a1a6-b1ff1368de6c\") " pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.248923 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b5314572-4537-46f6-a0f3-08112dd1d556-certs\") pod \"machine-config-server-j7kqr\" (UID: \"b5314572-4537-46f6-a0f3-08112dd1d556\") " pod="openshift-machine-config-operator/machine-config-server-j7kqr" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.248965 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3942ee53-6987-42c9-85ed-e5b799a1555d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-prq7r\" (UID: \"3942ee53-6987-42c9-85ed-e5b799a1555d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.248989 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1907dc62-9c82-492c-a7b9-642410848e1f-tmpfs\") pod \"packageserver-d55dfcdfc-cl2qb\" (UID: \"1907dc62-9c82-492c-a7b9-642410848e1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" Mar 20 12:29:07 crc kubenswrapper[4817]: 
I0320 12:29:07.249016 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d152b635-3488-428c-b8bd-b28e7fe13bef-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jqwpw\" (UID: \"d152b635-3488-428c-b8bd-b28e7fe13bef\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqwpw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249040 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vzds\" (UniqueName: \"kubernetes.io/projected/98722637-bcd0-4008-ad86-2a5a7e129b34-kube-api-access-7vzds\") pod \"kube-storage-version-migrator-operator-b67b599dd-8mn5j\" (UID: \"98722637-bcd0-4008-ad86-2a5a7e129b34\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mn5j" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249084 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9148329e-b499-499d-a1a6-b1ff1368de6c-default-certificate\") pod \"router-default-5444994796-nb552\" (UID: \"9148329e-b499-499d-a1a6-b1ff1368de6c\") " pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249106 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz5t6\" (UniqueName: \"kubernetes.io/projected/d0f2d38b-934b-4185-97ab-f43bfcbad479-kube-api-access-dz5t6\") pod \"machine-config-controller-84d6567774-qdnmj\" (UID: \"d0f2d38b-934b-4185-97ab-f43bfcbad479\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249169 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc678\" (UniqueName: 
\"kubernetes.io/projected/4d926e43-e19a-460f-8e87-1fe72e62d352-kube-api-access-bc678\") pod \"marketplace-operator-79b997595-bqnjh\" (UID: \"4d926e43-e19a-460f-8e87-1fe72e62d352\") " pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249190 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/413aaaf0-f437-4a19-845a-6b5c1bfccd07-images\") pod \"machine-config-operator-74547568cd-hrbgf\" (UID: \"413aaaf0-f437-4a19-845a-6b5c1bfccd07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249219 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/68eb3b35-c763-4024-8fd0-6dc63ea80eb8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wq2qb\" (UID: \"68eb3b35-c763-4024-8fd0-6dc63ea80eb8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq2qb" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249257 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b89aea9e-c134-42e3-b366-067d746cf7d7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2z24k\" (UID: \"b89aea9e-c134-42e3-b366-067d746cf7d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2z24k" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249285 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbhln\" (UniqueName: \"kubernetes.io/projected/413aaaf0-f437-4a19-845a-6b5c1bfccd07-kube-api-access-lbhln\") pod \"machine-config-operator-74547568cd-hrbgf\" (UID: \"413aaaf0-f437-4a19-845a-6b5c1bfccd07\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249340 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l6xd\" (UniqueName: \"kubernetes.io/projected/6bdeca1d-6768-4d19-a080-885b98c47f5b-kube-api-access-6l6xd\") pod \"olm-operator-6b444d44fb-zqvg6\" (UID: \"6bdeca1d-6768-4d19-a080-885b98c47f5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249380 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-secret-volume\") pod \"collect-profiles-29566815-k5qhj\" (UID: \"b6ef9afb-80f9-48dd-b41d-47874fcf3be9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249430 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d926e43-e19a-460f-8e87-1fe72e62d352-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bqnjh\" (UID: \"4d926e43-e19a-460f-8e87-1fe72e62d352\") " pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249469 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/413aaaf0-f437-4a19-845a-6b5c1bfccd07-proxy-tls\") pod \"machine-config-operator-74547568cd-hrbgf\" (UID: \"413aaaf0-f437-4a19-845a-6b5c1bfccd07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249494 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-489bn\" (UniqueName: 
\"kubernetes.io/projected/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-kube-api-access-489bn\") pod \"collect-profiles-29566815-k5qhj\" (UID: \"b6ef9afb-80f9-48dd-b41d-47874fcf3be9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249527 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/87a729e2-4d7e-4f68-bef5-ef25d4ab5b43-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-g9lk4\" (UID: \"87a729e2-4d7e-4f68-bef5-ef25d4ab5b43\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g9lk4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249550 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9148329e-b499-499d-a1a6-b1ff1368de6c-service-ca-bundle\") pod \"router-default-5444994796-nb552\" (UID: \"9148329e-b499-499d-a1a6-b1ff1368de6c\") " pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249574 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0f2d38b-934b-4185-97ab-f43bfcbad479-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qdnmj\" (UID: \"d0f2d38b-934b-4185-97ab-f43bfcbad479\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249586 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1907dc62-9c82-492c-a7b9-642410848e1f-tmpfs\") pod \"packageserver-d55dfcdfc-cl2qb\" (UID: \"1907dc62-9c82-492c-a7b9-642410848e1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249619 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98722637-bcd0-4008-ad86-2a5a7e129b34-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8mn5j\" (UID: \"98722637-bcd0-4008-ad86-2a5a7e129b34\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mn5j" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249643 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmz8h\" (UniqueName: \"kubernetes.io/projected/3942ee53-6987-42c9-85ed-e5b799a1555d-kube-api-access-cmz8h\") pod \"ingress-operator-5b745b69d9-prq7r\" (UID: \"3942ee53-6987-42c9-85ed-e5b799a1555d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249666 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdddv\" (UniqueName: \"kubernetes.io/projected/17ba5d71-5532-4d72-9505-9bab643bbd40-kube-api-access-rdddv\") pod \"service-ca-operator-777779d784-tzxvg\" (UID: \"17ba5d71-5532-4d72-9505-9bab643bbd40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249689 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98722637-bcd0-4008-ad86-2a5a7e129b34-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8mn5j\" (UID: \"98722637-bcd0-4008-ad86-2a5a7e129b34\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mn5j" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249723 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1907dc62-9c82-492c-a7b9-642410848e1f-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-cl2qb\" (UID: \"1907dc62-9c82-492c-a7b9-642410848e1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249757 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b89aea9e-c134-42e3-b366-067d746cf7d7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2z24k\" (UID: \"b89aea9e-c134-42e3-b366-067d746cf7d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2z24k" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249779 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/413aaaf0-f437-4a19-845a-6b5c1bfccd07-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hrbgf\" (UID: \"413aaaf0-f437-4a19-845a-6b5c1bfccd07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249814 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d152b635-3488-428c-b8bd-b28e7fe13bef-config\") pod \"kube-apiserver-operator-766d6c64bb-jqwpw\" (UID: \"d152b635-3488-428c-b8bd-b28e7fe13bef\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqwpw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249840 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/479d3864-1c15-4044-84b2-376cc2b603b6-profile-collector-cert\") pod \"catalog-operator-68c6474976-tw2rl\" (UID: \"479d3864-1c15-4044-84b2-376cc2b603b6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249875 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkscb\" (UniqueName: \"kubernetes.io/projected/68eb3b35-c763-4024-8fd0-6dc63ea80eb8-kube-api-access-wkscb\") pod \"control-plane-machine-set-operator-78cbb6b69f-wq2qb\" (UID: \"68eb3b35-c763-4024-8fd0-6dc63ea80eb8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq2qb" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249901 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv6tm\" (UniqueName: \"kubernetes.io/projected/85d5ecf6-da8d-4953-9b78-7ba019986d37-kube-api-access-fv6tm\") pod \"migrator-59844c95c7-7vdpg\" (UID: \"85d5ecf6-da8d-4953-9b78-7ba019986d37\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7vdpg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249923 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b5314572-4537-46f6-a0f3-08112dd1d556-node-bootstrap-token\") pod \"machine-config-server-j7kqr\" (UID: \"b5314572-4537-46f6-a0f3-08112dd1d556\") " pod="openshift-machine-config-operator/machine-config-server-j7kqr" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249949 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9148329e-b499-499d-a1a6-b1ff1368de6c-metrics-certs\") pod \"router-default-5444994796-nb552\" (UID: \"9148329e-b499-499d-a1a6-b1ff1368de6c\") " pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.249983 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4d926e43-e19a-460f-8e87-1fe72e62d352-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bqnjh\" (UID: 
\"4d926e43-e19a-460f-8e87-1fe72e62d352\") " pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.250017 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3942ee53-6987-42c9-85ed-e5b799a1555d-metrics-tls\") pod \"ingress-operator-5b745b69d9-prq7r\" (UID: \"3942ee53-6987-42c9-85ed-e5b799a1555d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.250038 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17ba5d71-5532-4d72-9505-9bab643bbd40-serving-cert\") pod \"service-ca-operator-777779d784-tzxvg\" (UID: \"17ba5d71-5532-4d72-9505-9bab643bbd40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.250058 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1907dc62-9c82-492c-a7b9-642410848e1f-webhook-cert\") pod \"packageserver-d55dfcdfc-cl2qb\" (UID: \"1907dc62-9c82-492c-a7b9-642410848e1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.250082 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pgzm\" (UniqueName: \"kubernetes.io/projected/344f693a-912d-41fe-a9f8-c344e2770d08-kube-api-access-7pgzm\") pod \"package-server-manager-789f6589d5-v8vzq\" (UID: \"344f693a-912d-41fe-a9f8-c344e2770d08\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.250138 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v896\" (UniqueName: 
\"kubernetes.io/projected/87a729e2-4d7e-4f68-bef5-ef25d4ab5b43-kube-api-access-8v896\") pod \"multus-admission-controller-857f4d67dd-g9lk4\" (UID: \"87a729e2-4d7e-4f68-bef5-ef25d4ab5b43\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g9lk4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.250163 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2mhl\" (UniqueName: \"kubernetes.io/projected/1907dc62-9c82-492c-a7b9-642410848e1f-kube-api-access-h2mhl\") pod \"packageserver-d55dfcdfc-cl2qb\" (UID: \"1907dc62-9c82-492c-a7b9-642410848e1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.250183 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-config-volume\") pod \"collect-profiles-29566815-k5qhj\" (UID: \"b6ef9afb-80f9-48dd-b41d-47874fcf3be9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.250201 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6bdeca1d-6768-4d19-a080-885b98c47f5b-srv-cert\") pod \"olm-operator-6b444d44fb-zqvg6\" (UID: \"6bdeca1d-6768-4d19-a080-885b98c47f5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.250221 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a46bd9-a747-4587-9333-f10f27e9be52-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5rq7n\" (UID: \"00a46bd9-a747-4587-9333-f10f27e9be52\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5rq7n" Mar 20 12:29:07 crc 
kubenswrapper[4817]: I0320 12:29:07.250242 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00a46bd9-a747-4587-9333-f10f27e9be52-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5rq7n\" (UID: \"00a46bd9-a747-4587-9333-f10f27e9be52\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5rq7n" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.250286 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9148329e-b499-499d-a1a6-b1ff1368de6c-stats-auth\") pod \"router-default-5444994796-nb552\" (UID: \"9148329e-b499-499d-a1a6-b1ff1368de6c\") " pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.250306 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3942ee53-6987-42c9-85ed-e5b799a1555d-trusted-ca\") pod \"ingress-operator-5b745b69d9-prq7r\" (UID: \"3942ee53-6987-42c9-85ed-e5b799a1555d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.250331 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/479d3864-1c15-4044-84b2-376cc2b603b6-srv-cert\") pod \"catalog-operator-68c6474976-tw2rl\" (UID: \"479d3864-1c15-4044-84b2-376cc2b603b6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.250369 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6bdeca1d-6768-4d19-a080-885b98c47f5b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zqvg6\" (UID: 
\"6bdeca1d-6768-4d19-a080-885b98c47f5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.250392 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0f2d38b-934b-4185-97ab-f43bfcbad479-proxy-tls\") pod \"machine-config-controller-84d6567774-qdnmj\" (UID: \"d0f2d38b-934b-4185-97ab-f43bfcbad479\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.250431 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d152b635-3488-428c-b8bd-b28e7fe13bef-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jqwpw\" (UID: \"d152b635-3488-428c-b8bd-b28e7fe13bef\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqwpw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.250537 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/413aaaf0-f437-4a19-845a-6b5c1bfccd07-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hrbgf\" (UID: \"413aaaf0-f437-4a19-845a-6b5c1bfccd07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.250905 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0f2d38b-934b-4185-97ab-f43bfcbad479-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qdnmj\" (UID: \"d0f2d38b-934b-4185-97ab-f43bfcbad479\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.251524 4817 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.271344 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.280501 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98722637-bcd0-4008-ad86-2a5a7e129b34-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8mn5j\" (UID: \"98722637-bcd0-4008-ad86-2a5a7e129b34\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mn5j" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.291234 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.311942 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.322678 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98722637-bcd0-4008-ad86-2a5a7e129b34-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8mn5j\" (UID: \"98722637-bcd0-4008-ad86-2a5a7e129b34\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mn5j" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.331750 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.351540 4817 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.371562 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.391890 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.405490 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b89aea9e-c134-42e3-b366-067d746cf7d7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2z24k\" (UID: \"b89aea9e-c134-42e3-b366-067d746cf7d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2z24k" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.411767 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.420634 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b89aea9e-c134-42e3-b366-067d746cf7d7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2z24k\" (UID: \"b89aea9e-c134-42e3-b366-067d746cf7d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2z24k" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.430708 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.446429 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0f2d38b-934b-4185-97ab-f43bfcbad479-proxy-tls\") pod 
\"machine-config-controller-84d6567774-qdnmj\" (UID: \"d0f2d38b-934b-4185-97ab-f43bfcbad479\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.451324 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.464366 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/87a729e2-4d7e-4f68-bef5-ef25d4ab5b43-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-g9lk4\" (UID: \"87a729e2-4d7e-4f68-bef5-ef25d4ab5b43\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g9lk4" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.471515 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.484088 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/68eb3b35-c763-4024-8fd0-6dc63ea80eb8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wq2qb\" (UID: \"68eb3b35-c763-4024-8fd0-6dc63ea80eb8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq2qb" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.492706 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.513856 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.531999 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.552095 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.564300 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4d926e43-e19a-460f-8e87-1fe72e62d352-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bqnjh\" (UID: \"4d926e43-e19a-460f-8e87-1fe72e62d352\") " pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.571552 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.599834 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.611200 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.611368 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d926e43-e19a-460f-8e87-1fe72e62d352-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bqnjh\" (UID: \"4d926e43-e19a-460f-8e87-1fe72e62d352\") " pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.632136 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.651378 4817 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-image-registry"/"image-registry-tls" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.663275 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.663353 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xq7wp" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.672055 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.692539 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.712207 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.732067 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.751196 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.772238 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.791717 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.811112 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.825477 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d152b635-3488-428c-b8bd-b28e7fe13bef-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jqwpw\" (UID: \"d152b635-3488-428c-b8bd-b28e7fe13bef\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqwpw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.832878 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.841306 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d152b635-3488-428c-b8bd-b28e7fe13bef-config\") pod \"kube-apiserver-operator-766d6c64bb-jqwpw\" (UID: \"d152b635-3488-428c-b8bd-b28e7fe13bef\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqwpw" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.861364 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.871573 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.872664 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3942ee53-6987-42c9-85ed-e5b799a1555d-trusted-ca\") pod \"ingress-operator-5b745b69d9-prq7r\" (UID: \"3942ee53-6987-42c9-85ed-e5b799a1555d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.881197 4817 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/413aaaf0-f437-4a19-845a-6b5c1bfccd07-images\") pod \"machine-config-operator-74547568cd-hrbgf\" (UID: \"413aaaf0-f437-4a19-845a-6b5c1bfccd07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.891869 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.910972 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.924745 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3942ee53-6987-42c9-85ed-e5b799a1555d-metrics-tls\") pod \"ingress-operator-5b745b69d9-prq7r\" (UID: \"3942ee53-6987-42c9-85ed-e5b799a1555d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.932209 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.952004 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.971622 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 12:29:07 crc kubenswrapper[4817]: I0320 12:29:07.992605 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.004194 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/413aaaf0-f437-4a19-845a-6b5c1bfccd07-proxy-tls\") pod \"machine-config-operator-74547568cd-hrbgf\" (UID: \"413aaaf0-f437-4a19-845a-6b5c1bfccd07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.012711 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.026694 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9148329e-b499-499d-a1a6-b1ff1368de6c-stats-auth\") pod \"router-default-5444994796-nb552\" (UID: \"9148329e-b499-499d-a1a6-b1ff1368de6c\") " pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.029917 4817 request.go:700] Waited for 1.015886034s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/secrets?fieldSelector=metadata.name%3Drouter-dockercfg-zdk86&limit=500&resourceVersion=0 Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.032680 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.050906 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.070697 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9148329e-b499-499d-a1a6-b1ff1368de6c-metrics-certs\") pod \"router-default-5444994796-nb552\" (UID: \"9148329e-b499-499d-a1a6-b1ff1368de6c\") " pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.071555 4817 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.091830 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.101177 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9148329e-b499-499d-a1a6-b1ff1368de6c-service-ca-bundle\") pod \"router-default-5444994796-nb552\" (UID: \"9148329e-b499-499d-a1a6-b1ff1368de6c\") " pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.112290 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.131512 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.145417 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9148329e-b499-499d-a1a6-b1ff1368de6c-default-certificate\") pod \"router-default-5444994796-nb552\" (UID: \"9148329e-b499-499d-a1a6-b1ff1368de6c\") " pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.153136 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.165556 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a46bd9-a747-4587-9333-f10f27e9be52-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5rq7n\" (UID: \"00a46bd9-a747-4587-9333-f10f27e9be52\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5rq7n" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.171877 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.180178 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a46bd9-a747-4587-9333-f10f27e9be52-config\") pod \"kube-controller-manager-operator-78b949d7b-5rq7n\" (UID: \"00a46bd9-a747-4587-9333-f10f27e9be52\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5rq7n" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.192634 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.211775 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.231893 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.249653 4817 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.249715 4817 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.249754 4817 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b5314572-4537-46f6-a0f3-08112dd1d556-certs podName:b5314572-4537-46f6-a0f3-08112dd1d556 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:08.749734406 +0000 UTC m=+110.838047189 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/b5314572-4537-46f6-a0f3-08112dd1d556-certs") pod "machine-config-server-j7kqr" (UID: "b5314572-4537-46f6-a0f3-08112dd1d556") : failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.249855 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-secret-volume podName:b6ef9afb-80f9-48dd-b41d-47874fcf3be9 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:08.749816748 +0000 UTC m=+110.838129571 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-volume" (UniqueName: "kubernetes.io/secret/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-secret-volume") pod "collect-profiles-29566815-k5qhj" (UID: "b6ef9afb-80f9-48dd-b41d-47874fcf3be9") : failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.249904 4817 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.250191 4817 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.250264 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17ba5d71-5532-4d72-9505-9bab643bbd40-serving-cert podName:17ba5d71-5532-4d72-9505-9bab643bbd40 nodeName:}" failed. 
No retries permitted until 2026-03-20 12:29:08.75023721 +0000 UTC m=+110.838550063 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/17ba5d71-5532-4d72-9505-9bab643bbd40-serving-cert") pod "service-ca-operator-777779d784-tzxvg" (UID: "17ba5d71-5532-4d72-9505-9bab643bbd40") : failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.249943 4817 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.250343 4817 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.249974 4817 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.250077 4817 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.250419 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-config-volume podName:b6ef9afb-80f9-48dd-b41d-47874fcf3be9 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:08.750397784 +0000 UTC m=+110.838710607 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-config-volume") pod "collect-profiles-29566815-k5qhj" (UID: "b6ef9afb-80f9-48dd-b41d-47874fcf3be9") : failed to sync configmap cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.250512 4817 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.250552 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/17ba5d71-5532-4d72-9505-9bab643bbd40-config podName:17ba5d71-5532-4d72-9505-9bab643bbd40 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:08.750515557 +0000 UTC m=+110.838828380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/17ba5d71-5532-4d72-9505-9bab643bbd40-config") pod "service-ca-operator-777779d784-tzxvg" (UID: "17ba5d71-5532-4d72-9505-9bab643bbd40") : failed to sync configmap cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.250581 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/344f693a-912d-41fe-a9f8-c344e2770d08-package-server-manager-serving-cert podName:344f693a-912d-41fe-a9f8-c344e2770d08 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:08.750566659 +0000 UTC m=+110.838879482 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/344f693a-912d-41fe-a9f8-c344e2770d08-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-v8vzq" (UID: "344f693a-912d-41fe-a9f8-c344e2770d08") : failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.250608 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5314572-4537-46f6-a0f3-08112dd1d556-node-bootstrap-token podName:b5314572-4537-46f6-a0f3-08112dd1d556 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:08.75059471 +0000 UTC m=+110.838907543 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/b5314572-4537-46f6-a0f3-08112dd1d556-node-bootstrap-token") pod "machine-config-server-j7kqr" (UID: "b5314572-4537-46f6-a0f3-08112dd1d556") : failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.250621 4817 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.250680 4817 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.250638 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bdeca1d-6768-4d19-a080-885b98c47f5b-srv-cert podName:6bdeca1d-6768-4d19-a080-885b98c47f5b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:08.7506257 +0000 UTC m=+110.838938523 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6bdeca1d-6768-4d19-a080-885b98c47f5b-srv-cert") pod "olm-operator-6b444d44fb-zqvg6" (UID: "6bdeca1d-6768-4d19-a080-885b98c47f5b") : failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.250659 4817 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.250845 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1907dc62-9c82-492c-a7b9-642410848e1f-apiservice-cert podName:1907dc62-9c82-492c-a7b9-642410848e1f nodeName:}" failed. No retries permitted until 2026-03-20 12:29:08.750750304 +0000 UTC m=+110.839063117 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/1907dc62-9c82-492c-a7b9-642410848e1f-apiservice-cert") pod "packageserver-d55dfcdfc-cl2qb" (UID: "1907dc62-9c82-492c-a7b9-642410848e1f") : failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.250865 4817 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.250948 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/479d3864-1c15-4044-84b2-376cc2b603b6-profile-collector-cert podName:479d3864-1c15-4044-84b2-376cc2b603b6 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:08.750872257 +0000 UTC m=+110.839185170 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/479d3864-1c15-4044-84b2-376cc2b603b6-profile-collector-cert") pod "catalog-operator-68c6474976-tw2rl" (UID: "479d3864-1c15-4044-84b2-376cc2b603b6") : failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.251055 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/479d3864-1c15-4044-84b2-376cc2b603b6-srv-cert podName:479d3864-1c15-4044-84b2-376cc2b603b6 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:08.75097559 +0000 UTC m=+110.839288403 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/479d3864-1c15-4044-84b2-376cc2b603b6-srv-cert") pod "catalog-operator-68c6474976-tw2rl" (UID: "479d3864-1c15-4044-84b2-376cc2b603b6") : failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.251198 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bdeca1d-6768-4d19-a080-885b98c47f5b-profile-collector-cert podName:6bdeca1d-6768-4d19-a080-885b98c47f5b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:08.751175386 +0000 UTC m=+110.839488229 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/6bdeca1d-6768-4d19-a080-885b98c47f5b-profile-collector-cert") pod "olm-operator-6b444d44fb-zqvg6" (UID: "6bdeca1d-6768-4d19-a080-885b98c47f5b") : failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.251234 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1907dc62-9c82-492c-a7b9-642410848e1f-webhook-cert podName:1907dc62-9c82-492c-a7b9-642410848e1f nodeName:}" failed. 
No retries permitted until 2026-03-20 12:29:08.751218947 +0000 UTC m=+110.839531770 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/1907dc62-9c82-492c-a7b9-642410848e1f-webhook-cert") pod "packageserver-d55dfcdfc-cl2qb" (UID: "1907dc62-9c82-492c-a7b9-642410848e1f") : failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.254489 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.272247 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.291337 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.311936 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.332167 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.351471 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.370780 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.391440 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 12:29:08 crc 
kubenswrapper[4817]: I0320 12:29:08.411709 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.431972 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.451787 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.471519 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.474907 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.475178 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:24.475143518 +0000 UTC m=+126.563456321 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.475321 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.475519 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.475618 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.475759 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.475777 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.475789 4817 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.475847 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:24.475833737 +0000 UTC m=+126.564146520 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.475923 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.476084 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.476103 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.476117 4817 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:29:08 crc kubenswrapper[4817]: E0320 12:29:08.476215 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-20 12:29:24.476201558 +0000 UTC m=+126.564514351 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.490645 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.512372 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.531109 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.553002 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.571198 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.592018 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.611707 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.631820 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.652381 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.662956 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.662957 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.672399 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.691355 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.746038 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl2wv\" (UniqueName: \"kubernetes.io/projected/a2493ed0-295f-4eba-8870-3f5716a76ca6-kube-api-access-fl2wv\") pod \"oauth-openshift-558db77b4-v6wkw\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.768298 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k65mj\" (UniqueName: \"kubernetes.io/projected/9351978b-90a5-48f6-ba2b-68e2c4f2c574-kube-api-access-k65mj\") pod \"console-f9d7485db-mzrc4\" (UID: \"9351978b-90a5-48f6-ba2b-68e2c4f2c574\") " pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.780798 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-secret-volume\") pod \"collect-profiles-29566815-k5qhj\" (UID: \"b6ef9afb-80f9-48dd-b41d-47874fcf3be9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.781013 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1907dc62-9c82-492c-a7b9-642410848e1f-apiservice-cert\") pod \"packageserver-d55dfcdfc-cl2qb\" (UID: \"1907dc62-9c82-492c-a7b9-642410848e1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.781113 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/479d3864-1c15-4044-84b2-376cc2b603b6-profile-collector-cert\") pod \"catalog-operator-68c6474976-tw2rl\" (UID: \"479d3864-1c15-4044-84b2-376cc2b603b6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.781215 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b5314572-4537-46f6-a0f3-08112dd1d556-node-bootstrap-token\") pod \"machine-config-server-j7kqr\" (UID: \"b5314572-4537-46f6-a0f3-08112dd1d556\") " pod="openshift-machine-config-operator/machine-config-server-j7kqr" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.781292 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17ba5d71-5532-4d72-9505-9bab643bbd40-serving-cert\") pod \"service-ca-operator-777779d784-tzxvg\" (UID: \"17ba5d71-5532-4d72-9505-9bab643bbd40\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.781327 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1907dc62-9c82-492c-a7b9-642410848e1f-webhook-cert\") pod \"packageserver-d55dfcdfc-cl2qb\" (UID: \"1907dc62-9c82-492c-a7b9-642410848e1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.781440 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-config-volume\") pod \"collect-profiles-29566815-k5qhj\" (UID: \"b6ef9afb-80f9-48dd-b41d-47874fcf3be9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.781475 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6bdeca1d-6768-4d19-a080-885b98c47f5b-srv-cert\") pod \"olm-operator-6b444d44fb-zqvg6\" (UID: \"6bdeca1d-6768-4d19-a080-885b98c47f5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.781520 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/479d3864-1c15-4044-84b2-376cc2b603b6-srv-cert\") pod \"catalog-operator-68c6474976-tw2rl\" (UID: \"479d3864-1c15-4044-84b2-376cc2b603b6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.781567 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6bdeca1d-6768-4d19-a080-885b98c47f5b-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-zqvg6\" (UID: \"6bdeca1d-6768-4d19-a080-885b98c47f5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.781618 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17ba5d71-5532-4d72-9505-9bab643bbd40-config\") pod \"service-ca-operator-777779d784-tzxvg\" (UID: \"17ba5d71-5532-4d72-9505-9bab643bbd40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.782357 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-config-volume\") pod \"collect-profiles-29566815-k5qhj\" (UID: \"b6ef9afb-80f9-48dd-b41d-47874fcf3be9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.782399 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/344f693a-912d-41fe-a9f8-c344e2770d08-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-v8vzq\" (UID: \"344f693a-912d-41fe-a9f8-c344e2770d08\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.782493 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b5314572-4537-46f6-a0f3-08112dd1d556-certs\") pod \"machine-config-server-j7kqr\" (UID: \"b5314572-4537-46f6-a0f3-08112dd1d556\") " pod="openshift-machine-config-operator/machine-config-server-j7kqr" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.782896 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/17ba5d71-5532-4d72-9505-9bab643bbd40-config\") pod \"service-ca-operator-777779d784-tzxvg\" (UID: \"17ba5d71-5532-4d72-9505-9bab643bbd40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.784549 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/479d3864-1c15-4044-84b2-376cc2b603b6-profile-collector-cert\") pod \"catalog-operator-68c6474976-tw2rl\" (UID: \"479d3864-1c15-4044-84b2-376cc2b603b6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.785038 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b5314572-4537-46f6-a0f3-08112dd1d556-node-bootstrap-token\") pod \"machine-config-server-j7kqr\" (UID: \"b5314572-4537-46f6-a0f3-08112dd1d556\") " pod="openshift-machine-config-operator/machine-config-server-j7kqr" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.785226 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1907dc62-9c82-492c-a7b9-642410848e1f-webhook-cert\") pod \"packageserver-d55dfcdfc-cl2qb\" (UID: \"1907dc62-9c82-492c-a7b9-642410848e1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.785820 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6bdeca1d-6768-4d19-a080-885b98c47f5b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zqvg6\" (UID: \"6bdeca1d-6768-4d19-a080-885b98c47f5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.786771 4817 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/344f693a-912d-41fe-a9f8-c344e2770d08-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-v8vzq\" (UID: \"344f693a-912d-41fe-a9f8-c344e2770d08\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.786849 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1907dc62-9c82-492c-a7b9-642410848e1f-apiservice-cert\") pod \"packageserver-d55dfcdfc-cl2qb\" (UID: \"1907dc62-9c82-492c-a7b9-642410848e1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.787183 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17ba5d71-5532-4d72-9505-9bab643bbd40-serving-cert\") pod \"service-ca-operator-777779d784-tzxvg\" (UID: \"17ba5d71-5532-4d72-9505-9bab643bbd40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.787894 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-secret-volume\") pod \"collect-profiles-29566815-k5qhj\" (UID: \"b6ef9afb-80f9-48dd-b41d-47874fcf3be9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.787992 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/479d3864-1c15-4044-84b2-376cc2b603b6-srv-cert\") pod \"catalog-operator-68c6474976-tw2rl\" (UID: \"479d3864-1c15-4044-84b2-376cc2b603b6\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.789010 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6bdeca1d-6768-4d19-a080-885b98c47f5b-srv-cert\") pod \"olm-operator-6b444d44fb-zqvg6\" (UID: \"6bdeca1d-6768-4d19-a080-885b98c47f5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.789289 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvlpg\" (UniqueName: \"kubernetes.io/projected/05c7c173-9fcc-415c-a13e-9290bc4e5735-kube-api-access-fvlpg\") pod \"cluster-samples-operator-665b6dd947-n45lz\" (UID: \"05c7c173-9fcc-415c-a13e-9290bc4e5735\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n45lz" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.789461 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b5314572-4537-46f6-a0f3-08112dd1d556-certs\") pod \"machine-config-server-j7kqr\" (UID: \"b5314572-4537-46f6-a0f3-08112dd1d556\") " pod="openshift-machine-config-operator/machine-config-server-j7kqr" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.809037 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s66qw\" (UniqueName: \"kubernetes.io/projected/9f644c83-90e3-4eb1-80b0-82781b255d15-kube-api-access-s66qw\") pod \"etcd-operator-b45778765-2jnf2\" (UID: \"9f644c83-90e3-4eb1-80b0-82781b255d15\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.826171 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nzmm\" (UniqueName: \"kubernetes.io/projected/e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb-kube-api-access-2nzmm\") pod 
\"machine-approver-56656f9798-bgdpq\" (UID: \"e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.829055 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.847130 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78vxr\" (UniqueName: \"kubernetes.io/projected/1efe6fbf-e6da-4413-ab32-6c457572e894-kube-api-access-78vxr\") pod \"dns-operator-744455d44c-cw976\" (UID: \"1efe6fbf-e6da-4413-ab32-6c457572e894\") " pod="openshift-dns-operator/dns-operator-744455d44c-cw976" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.867640 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.868590 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25t8n\" (UniqueName: \"kubernetes.io/projected/e845d329-b1ce-48a9-8088-cbb4aabe49e4-kube-api-access-25t8n\") pod \"openshift-apiserver-operator-796bbdcf4f-6lwrj\" (UID: \"e845d329-b1ce-48a9-8088-cbb4aabe49e4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6lwrj" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.876814 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n45lz" Mar 20 12:29:08 crc kubenswrapper[4817]: W0320 12:29:08.884654 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode819d4db_e0d9_4bc4_afe2_7cea81b1d0fb.slice/crio-d7ee2b82912a7d5b924e1a52c742a57d379892356ad04484d54cc7c2ea7dbb06 WatchSource:0}: Error finding container d7ee2b82912a7d5b924e1a52c742a57d379892356ad04484d54cc7c2ea7dbb06: Status 404 returned error can't find the container with id d7ee2b82912a7d5b924e1a52c742a57d379892356ad04484d54cc7c2ea7dbb06 Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.886944 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmkth\" (UniqueName: \"kubernetes.io/projected/b432f9f8-8e62-4ba9-b852-3e670ea6cdbe-kube-api-access-rmkth\") pod \"openshift-config-operator-7777fb866f-h694w\" (UID: \"b432f9f8-8e62-4ba9-b852-3e670ea6cdbe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h694w" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.895827 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h694w" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.915838 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69l2w\" (UniqueName: \"kubernetes.io/projected/f5457083-eb9e-4828-839c-a7613592278e-kube-api-access-69l2w\") pod \"route-controller-manager-6576b87f9c-vt6cg\" (UID: \"f5457083-eb9e-4828-839c-a7613592278e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.931486 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7qhp\" (UniqueName: \"kubernetes.io/projected/b3057700-8d5d-4e2e-8e0c-66490bfe55c5-kube-api-access-h7qhp\") pod \"apiserver-76f77b778f-gbltz\" (UID: \"b3057700-8d5d-4e2e-8e0c-66490bfe55c5\") " pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.933818 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:08 crc kubenswrapper[4817]: I0320 12:29:08.952635 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cljqm\" (UniqueName: \"kubernetes.io/projected/df4eb359-67fa-4b25-b909-a21b16b5be2d-kube-api-access-cljqm\") pod \"openshift-controller-manager-operator-756b6f6bc6-r5qk5\" (UID: \"df4eb359-67fa-4b25-b909-a21b16b5be2d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r5qk5" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.002608 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.011604 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.017260 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkcl2\" (UniqueName: \"kubernetes.io/projected/df2df950-540c-408e-a555-81b7e7da9e26-kube-api-access-fkcl2\") pod \"controller-manager-879f6c89f-jsgjh\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.017597 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.018747 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx7mf\" (UniqueName: \"kubernetes.io/projected/a9414873-8b3d-4dfb-94a2-604382a77729-kube-api-access-hx7mf\") pod \"cluster-image-registry-operator-dc59b4c8b-hnqtm\" (UID: \"a9414873-8b3d-4dfb-94a2-604382a77729\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.028270 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhbrp\" (UniqueName: \"kubernetes.io/projected/928ba203-a815-4c6d-9097-e1eafd194ab0-kube-api-access-xhbrp\") pod \"downloads-7954f5f757-2jblp\" (UID: \"928ba203-a815-4c6d-9097-e1eafd194ab0\") " pod="openshift-console/downloads-7954f5f757-2jblp" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.032928 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjn6d\" (UniqueName: \"kubernetes.io/projected/da4ab27a-0064-4501-9475-4eecdd9ffbcc-kube-api-access-qjn6d\") pod \"machine-api-operator-5694c8668f-crt6p\" (UID: \"da4ab27a-0064-4501-9475-4eecdd9ffbcc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" 
Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.050017 4817 request.go:700] Waited for 1.910958902s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-oauth-apiserver/serviceaccounts/oauth-apiserver-sa/token Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.053180 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9414873-8b3d-4dfb-94a2-604382a77729-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hnqtm\" (UID: \"a9414873-8b3d-4dfb-94a2-604382a77729\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.069827 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnxcr\" (UniqueName: \"kubernetes.io/projected/2713a227-3462-4d3d-86ff-6cb0101ef6be-kube-api-access-pnxcr\") pod \"apiserver-7bbb656c7d-bwlwt\" (UID: \"2713a227-3462-4d3d-86ff-6cb0101ef6be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.076587 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r5qk5" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.092168 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vlpm\" (UniqueName: \"kubernetes.io/projected/4e26a2f6-3ad6-4113-8120-fc0af5b772c6-kube-api-access-6vlpm\") pod \"authentication-operator-69f744f599-9cwss\" (UID: \"4e26a2f6-3ad6-4113-8120-fc0af5b772c6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.109703 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-cw976" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.111157 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.114599 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rb7w\" (UniqueName: \"kubernetes.io/projected/b64a28cb-8f75-48c6-8980-8a1003ffba98-kube-api-access-5rb7w\") pod \"console-operator-58897d9998-8h755\" (UID: \"b64a28cb-8f75-48c6-8980-8a1003ffba98\") " pod="openshift-console-operator/console-operator-58897d9998-8h755" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.122447 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6lwrj" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.136077 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2jnf2"] Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.142199 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.143309 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-8h755" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.153163 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.171019 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.195317 4817 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.212884 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.213087 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" event={"ID":"9f644c83-90e3-4eb1-80b0-82781b255d15","Type":"ContainerStarted","Data":"c320d7645726601c2335008174c870722504e374c845aa082fcc1c1f017644c2"} Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.214029 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" event={"ID":"e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb","Type":"ContainerStarted","Data":"d7ee2b82912a7d5b924e1a52c742a57d379892356ad04484d54cc7c2ea7dbb06"} Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.231842 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.253178 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.274897 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.294889 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.295308 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.299109 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.312349 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-2jblp" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.320957 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.323538 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.330436 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.354097 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swp4s\" (UniqueName: \"kubernetes.io/projected/479d3864-1c15-4044-84b2-376cc2b603b6-kube-api-access-swp4s\") pod \"catalog-operator-68c6474976-tw2rl\" (UID: \"479d3864-1c15-4044-84b2-376cc2b603b6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.365971 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n45lz"] Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.367133 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gbltz"] Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.367791 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckzg6\" (UniqueName: \"kubernetes.io/projected/9148329e-b499-499d-a1a6-b1ff1368de6c-kube-api-access-ckzg6\") pod \"router-default-5444994796-nb552\" (UID: \"9148329e-b499-499d-a1a6-b1ff1368de6c\") " pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.387877 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr49l\" (UniqueName: \"kubernetes.io/projected/b5314572-4537-46f6-a0f3-08112dd1d556-kube-api-access-zr49l\") pod \"machine-config-server-j7kqr\" (UID: \"b5314572-4537-46f6-a0f3-08112dd1d556\") " pod="openshift-machine-config-operator/machine-config-server-j7kqr" Mar 20 12:29:09 crc kubenswrapper[4817]: W0320 12:29:09.401060 4817 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3057700_8d5d_4e2e_8e0c_66490bfe55c5.slice/crio-c21a79e6e4a9b23657d412f5e5f8d3fca4009223c7e57d13df58c1bd0a5d7672 WatchSource:0}: Error finding container c21a79e6e4a9b23657d412f5e5f8d3fca4009223c7e57d13df58c1bd0a5d7672: Status 404 returned error can't find the container with id c21a79e6e4a9b23657d412f5e5f8d3fca4009223c7e57d13df58c1bd0a5d7672 Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.410204 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v6wkw"] Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.410237 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-mzrc4"] Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.418379 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.419532 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3942ee53-6987-42c9-85ed-e5b799a1555d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-prq7r\" (UID: \"3942ee53-6987-42c9-85ed-e5b799a1555d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.438906 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d152b635-3488-428c-b8bd-b28e7fe13bef-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jqwpw\" (UID: \"d152b635-3488-428c-b8bd-b28e7fe13bef\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqwpw" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.454961 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-j7kqr" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.455349 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz5t6\" (UniqueName: \"kubernetes.io/projected/d0f2d38b-934b-4185-97ab-f43bfcbad479-kube-api-access-dz5t6\") pod \"machine-config-controller-84d6567774-qdnmj\" (UID: \"d0f2d38b-934b-4185-97ab-f43bfcbad479\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.460086 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r5qk5"] Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.474439 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc678\" (UniqueName: \"kubernetes.io/projected/4d926e43-e19a-460f-8e87-1fe72e62d352-kube-api-access-bc678\") pod \"marketplace-operator-79b997595-bqnjh\" (UID: \"4d926e43-e19a-460f-8e87-1fe72e62d352\") " pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" Mar 20 12:29:09 crc kubenswrapper[4817]: E0320 12:29:09.476924 4817 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:09 crc kubenswrapper[4817]: E0320 12:29:09.477005 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:25.476984822 +0000 UTC m=+127.565297605 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.478465 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" Mar 20 12:29:09 crc kubenswrapper[4817]: E0320 12:29:09.481226 4817 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Mar 20 12:29:09 crc kubenswrapper[4817]: E0320 12:29:09.481263 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 12:29:25.481254341 +0000 UTC m=+127.569567124 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.483544 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h694w"] Mar 20 12:29:09 crc kubenswrapper[4817]: W0320 12:29:09.487832 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf4eb359_67fa_4b25_b909_a21b16b5be2d.slice/crio-397915cbb6f0298e2fa8cb72b72a4a58535afe7047beaf2d6b490e074da214ab WatchSource:0}: Error finding container 397915cbb6f0298e2fa8cb72b72a4a58535afe7047beaf2d6b490e074da214ab: Status 404 returned error can't find the container with id 397915cbb6f0298e2fa8cb72b72a4a58535afe7047beaf2d6b490e074da214ab Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.488899 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbhln\" (UniqueName: \"kubernetes.io/projected/413aaaf0-f437-4a19-845a-6b5c1bfccd07-kube-api-access-lbhln\") pod \"machine-config-operator-74547568cd-hrbgf\" (UID: \"413aaaf0-f437-4a19-845a-6b5c1bfccd07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.512261 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b89aea9e-c134-42e3-b366-067d746cf7d7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2z24k\" (UID: \"b89aea9e-c134-42e3-b366-067d746cf7d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2z24k" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.512309 
4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8h755"] Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.513313 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg"] Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.530790 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-489bn\" (UniqueName: \"kubernetes.io/projected/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-kube-api-access-489bn\") pod \"collect-profiles-29566815-k5qhj\" (UID: \"b6ef9afb-80f9-48dd-b41d-47874fcf3be9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.539950 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6lwrj"] Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.548455 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vzds\" (UniqueName: \"kubernetes.io/projected/98722637-bcd0-4008-ad86-2a5a7e129b34-kube-api-access-7vzds\") pod \"kube-storage-version-migrator-operator-b67b599dd-8mn5j\" (UID: \"98722637-bcd0-4008-ad86-2a5a7e129b34\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mn5j" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.572624 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdddv\" (UniqueName: \"kubernetes.io/projected/17ba5d71-5532-4d72-9505-9bab643bbd40-kube-api-access-rdddv\") pod \"service-ca-operator-777779d784-tzxvg\" (UID: \"17ba5d71-5532-4d72-9505-9bab643bbd40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.593234 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-cmz8h\" (UniqueName: \"kubernetes.io/projected/3942ee53-6987-42c9-85ed-e5b799a1555d-kube-api-access-cmz8h\") pod \"ingress-operator-5b745b69d9-prq7r\" (UID: \"3942ee53-6987-42c9-85ed-e5b799a1555d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.598918 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jsgjh"] Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.618616 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv6tm\" (UniqueName: \"kubernetes.io/projected/85d5ecf6-da8d-4953-9b78-7ba019986d37-kube-api-access-fv6tm\") pod \"migrator-59844c95c7-7vdpg\" (UID: \"85d5ecf6-da8d-4953-9b78-7ba019986d37\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7vdpg" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.626769 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkscb\" (UniqueName: \"kubernetes.io/projected/68eb3b35-c763-4024-8fd0-6dc63ea80eb8-kube-api-access-wkscb\") pod \"control-plane-machine-set-operator-78cbb6b69f-wq2qb\" (UID: \"68eb3b35-c763-4024-8fd0-6dc63ea80eb8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq2qb" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.636588 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mn5j" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.643759 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2z24k" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.646271 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pgzm\" (UniqueName: \"kubernetes.io/projected/344f693a-912d-41fe-a9f8-c344e2770d08-kube-api-access-7pgzm\") pod \"package-server-manager-789f6589d5-v8vzq\" (UID: \"344f693a-912d-41fe-a9f8-c344e2770d08\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.651325 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.663428 4817 scope.go:117] "RemoveContainer" containerID="3e572dcd05d0127d887dd34d43b57ba70ca318e81366eeeed89df28e8cd63823" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.663642 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq2qb" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.668964 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v896\" (UniqueName: \"kubernetes.io/projected/87a729e2-4d7e-4f68-bef5-ef25d4ab5b43-kube-api-access-8v896\") pod \"multus-admission-controller-857f4d67dd-g9lk4\" (UID: \"87a729e2-4d7e-4f68-bef5-ef25d4ab5b43\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g9lk4" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.672784 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7vdpg" Mar 20 12:29:09 crc kubenswrapper[4817]: W0320 12:29:09.672984 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf2df950_540c_408e_a555_81b7e7da9e26.slice/crio-c3f13b771077ca9e983dc21573c89eccf2f33e9a49c82d884223486ae67ff094 WatchSource:0}: Error finding container c3f13b771077ca9e983dc21573c89eccf2f33e9a49c82d884223486ae67ff094: Status 404 returned error can't find the container with id c3f13b771077ca9e983dc21573c89eccf2f33e9a49c82d884223486ae67ff094 Mar 20 12:29:09 crc kubenswrapper[4817]: W0320 12:29:09.686230 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5457083_eb9e_4828_839c_a7613592278e.slice/crio-aca45b8bff85d9946fb90ceee630dd670d8881026cf8647befd776e32d866e6a WatchSource:0}: Error finding container aca45b8bff85d9946fb90ceee630dd670d8881026cf8647befd776e32d866e6a: Status 404 returned error can't find the container with id aca45b8bff85d9946fb90ceee630dd670d8881026cf8647befd776e32d866e6a Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.687580 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.687960 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2mhl\" (UniqueName: \"kubernetes.io/projected/1907dc62-9c82-492c-a7b9-642410848e1f-kube-api-access-h2mhl\") pod \"packageserver-d55dfcdfc-cl2qb\" (UID: \"1907dc62-9c82-492c-a7b9-642410848e1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.698139 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cw976"] Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.698974 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqwpw" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.707017 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.713380 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.714276 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00a46bd9-a747-4587-9333-f10f27e9be52-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5rq7n\" (UID: \"00a46bd9-a747-4587-9333-f10f27e9be52\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5rq7n" Mar 20 12:29:09 crc kubenswrapper[4817]: W0320 12:29:09.721136 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5314572_4537_46f6_a0f3_08112dd1d556.slice/crio-a20cbb31eab336bd192c798c52718be71d80d479b43b3dd17850d142dce0f817 WatchSource:0}: Error finding container a20cbb31eab336bd192c798c52718be71d80d479b43b3dd17850d142dce0f817: Status 404 returned error can't find the container with id a20cbb31eab336bd192c798c52718be71d80d479b43b3dd17850d142dce0f817 Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.725636 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l6xd\" (UniqueName: \"kubernetes.io/projected/6bdeca1d-6768-4d19-a080-885b98c47f5b-kube-api-access-6l6xd\") pod \"olm-operator-6b444d44fb-zqvg6\" (UID: \"6bdeca1d-6768-4d19-a080-885b98c47f5b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.731973 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.733458 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5rq7n" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.743308 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.749084 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.750579 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.764346 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm"] Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.765813 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9cwss"] Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.771237 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.774239 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.784525 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.791078 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.791581 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.812023 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.817572 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2jblp"] Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.831496 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.899188 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt"] Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.905738 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-crt6p"] Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.920316 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl"] Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.920790 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-registry-tls\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.920851 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/bae6b6df-ce5b-473f-b03d-07b9d4380961-registry-certificates\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.920870 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-bound-sa-token\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.920910 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bst9l\" (UniqueName: \"kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-kube-api-access-bst9l\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.920929 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j586r\" (UniqueName: \"kubernetes.io/projected/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-kube-api-access-j586r\") pod \"cni-sysctl-allowlist-ds-rwqck\" (UID: \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.920949 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7849m\" (UniqueName: \"kubernetes.io/projected/122a38f4-116c-41d1-926d-7c1c3d6ba167-kube-api-access-7849m\") pod \"service-ca-9c57cc56f-6pc55\" (UID: \"122a38f4-116c-41d1-926d-7c1c3d6ba167\") " pod="openshift-service-ca/service-ca-9c57cc56f-6pc55" Mar 20 
12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.920983 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-rwqck\" (UID: \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.921013 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/122a38f4-116c-41d1-926d-7c1c3d6ba167-signing-key\") pod \"service-ca-9c57cc56f-6pc55\" (UID: \"122a38f4-116c-41d1-926d-7c1c3d6ba167\") " pod="openshift-service-ca/service-ca-9c57cc56f-6pc55" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.921032 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-rwqck\" (UID: \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.921056 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.921075 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bae6b6df-ce5b-473f-b03d-07b9d4380961-installation-pull-secrets\") pod 
\"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.921148 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bae6b6df-ce5b-473f-b03d-07b9d4380961-trusted-ca\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.921167 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bae6b6df-ce5b-473f-b03d-07b9d4380961-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.921183 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/122a38f4-116c-41d1-926d-7c1c3d6ba167-signing-cabundle\") pod \"service-ca-9c57cc56f-6pc55\" (UID: \"122a38f4-116c-41d1-926d-7c1c3d6ba167\") " pod="openshift-service-ca/service-ca-9c57cc56f-6pc55" Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.921212 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-ready\") pod \"cni-sysctl-allowlist-ds-rwqck\" (UID: \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" Mar 20 12:29:09 crc kubenswrapper[4817]: E0320 12:29:09.921939 4817 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:10.421927053 +0000 UTC m=+112.510239836 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:09 crc kubenswrapper[4817]: I0320 12:29:09.958048 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9lk4" Mar 20 12:29:09 crc kubenswrapper[4817]: W0320 12:29:09.963354 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2713a227_3462_4d3d_86ff_6cb0101ef6be.slice/crio-2927e129b3e55e31f60c95b76ff984a33f02e6757cdd45b1b5b067f706902740 WatchSource:0}: Error finding container 2927e129b3e55e31f60c95b76ff984a33f02e6757cdd45b1b5b067f706902740: Status 404 returned error can't find the container with id 2927e129b3e55e31f60c95b76ff984a33f02e6757cdd45b1b5b067f706902740 Mar 20 12:29:09 crc kubenswrapper[4817]: W0320 12:29:09.965409 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod928ba203_a815_4c6d_9097_e1eafd194ab0.slice/crio-7198ce78ee60c4bda14779a22d1f2da2a3d31cda9d9cc519ae4f9cf92cda77f4 WatchSource:0}: Error finding container 7198ce78ee60c4bda14779a22d1f2da2a3d31cda9d9cc519ae4f9cf92cda77f4: Status 404 returned error can't find the container with id 7198ce78ee60c4bda14779a22d1f2da2a3d31cda9d9cc519ae4f9cf92cda77f4 Mar 20 
12:29:09 crc kubenswrapper[4817]: W0320 12:29:09.968983 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda4ab27a_0064_4501_9475_4eecdd9ffbcc.slice/crio-19bdf69f9e47575ab7eee629bac778d4706827b572ea1f60e96177e8e11e3d63 WatchSource:0}: Error finding container 19bdf69f9e47575ab7eee629bac778d4706827b572ea1f60e96177e8e11e3d63: Status 404 returned error can't find the container with id 19bdf69f9e47575ab7eee629bac778d4706827b572ea1f60e96177e8e11e3d63
Mar 20 12:29:10 crc kubenswrapper[4817]: W0320 12:29:10.019213 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod479d3864_1c15_4044_84b2_376cc2b603b6.slice/crio-b234a1bfd1623f62af78c1748ac4bc01c2557e82eeb39d997e84954da44b0997 WatchSource:0}: Error finding container b234a1bfd1623f62af78c1748ac4bc01c2557e82eeb39d997e84954da44b0997: Status 404 returned error can't find the container with id b234a1bfd1623f62af78c1748ac4bc01c2557e82eeb39d997e84954da44b0997
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.022334 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.022597 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0f4ee62-4cd8-4085-855e-9304b9ce5018-metrics-tls\") pod \"dns-default-76blx\" (UID: \"b0f4ee62-4cd8-4085-855e-9304b9ce5018\") " pod="openshift-dns/dns-default-76blx"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.022678 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/70b7229d-a4be-4ee2-9b94-d7bc011986d7-csi-data-dir\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.022717 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fvq4\" (UniqueName: \"kubernetes.io/projected/a60d8f02-3c8e-43ef-8fd3-0aca152cfb12-kube-api-access-4fvq4\") pod \"ingress-canary-zb66q\" (UID: \"a60d8f02-3c8e-43ef-8fd3-0aca152cfb12\") " pod="openshift-ingress-canary/ingress-canary-zb66q"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.022744 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0f4ee62-4cd8-4085-855e-9304b9ce5018-config-volume\") pod \"dns-default-76blx\" (UID: \"b0f4ee62-4cd8-4085-855e-9304b9ce5018\") " pod="openshift-dns/dns-default-76blx"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.022758 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c9hf\" (UniqueName: \"kubernetes.io/projected/b0f4ee62-4cd8-4085-855e-9304b9ce5018-kube-api-access-7c9hf\") pod \"dns-default-76blx\" (UID: \"b0f4ee62-4cd8-4085-855e-9304b9ce5018\") " pod="openshift-dns/dns-default-76blx"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.022803 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bae6b6df-ce5b-473f-b03d-07b9d4380961-trusted-ca\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.022876 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bae6b6df-ce5b-473f-b03d-07b9d4380961-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.022910 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/122a38f4-116c-41d1-926d-7c1c3d6ba167-signing-cabundle\") pod \"service-ca-9c57cc56f-6pc55\" (UID: \"122a38f4-116c-41d1-926d-7c1c3d6ba167\") " pod="openshift-service-ca/service-ca-9c57cc56f-6pc55"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.022948 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/70b7229d-a4be-4ee2-9b94-d7bc011986d7-registration-dir\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.023025 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-565xs\" (UniqueName: \"kubernetes.io/projected/70b7229d-a4be-4ee2-9b94-d7bc011986d7-kube-api-access-565xs\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.023054 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-ready\") pod \"cni-sysctl-allowlist-ds-rwqck\" (UID: \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.023161 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-registry-tls\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.023221 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bae6b6df-ce5b-473f-b03d-07b9d4380961-registry-certificates\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.023238 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-bound-sa-token\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.023278 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/70b7229d-a4be-4ee2-9b94-d7bc011986d7-socket-dir\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.023436 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bst9l\" (UniqueName: \"kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-kube-api-access-bst9l\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.023484 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j586r\" (UniqueName: \"kubernetes.io/projected/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-kube-api-access-j586r\") pod \"cni-sysctl-allowlist-ds-rwqck\" (UID: \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.023504 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7849m\" (UniqueName: \"kubernetes.io/projected/122a38f4-116c-41d1-926d-7c1c3d6ba167-kube-api-access-7849m\") pod \"service-ca-9c57cc56f-6pc55\" (UID: \"122a38f4-116c-41d1-926d-7c1c3d6ba167\") " pod="openshift-service-ca/service-ca-9c57cc56f-6pc55"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.023644 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-rwqck\" (UID: \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck"
Mar 20 12:29:10 crc kubenswrapper[4817]: E0320 12:29:10.023714 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:10.52369333 +0000 UTC m=+112.612006123 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.024023 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a60d8f02-3c8e-43ef-8fd3-0aca152cfb12-cert\") pod \"ingress-canary-zb66q\" (UID: \"a60d8f02-3c8e-43ef-8fd3-0aca152cfb12\") " pod="openshift-ingress-canary/ingress-canary-zb66q"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.024137 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/122a38f4-116c-41d1-926d-7c1c3d6ba167-signing-key\") pod \"service-ca-9c57cc56f-6pc55\" (UID: \"122a38f4-116c-41d1-926d-7c1c3d6ba167\") " pod="openshift-service-ca/service-ca-9c57cc56f-6pc55"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.024166 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/70b7229d-a4be-4ee2-9b94-d7bc011986d7-mountpoint-dir\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.024187 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/70b7229d-a4be-4ee2-9b94-d7bc011986d7-plugins-dir\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.024347 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-rwqck\" (UID: \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.024402 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bae6b6df-ce5b-473f-b03d-07b9d4380961-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.024432 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.025914 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bae6b6df-ce5b-473f-b03d-07b9d4380961-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.026912 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-ready\") pod \"cni-sysctl-allowlist-ds-rwqck\" (UID: \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.027103 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bae6b6df-ce5b-473f-b03d-07b9d4380961-trusted-ca\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.027156 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/122a38f4-116c-41d1-926d-7c1c3d6ba167-signing-cabundle\") pod \"service-ca-9c57cc56f-6pc55\" (UID: \"122a38f4-116c-41d1-926d-7c1c3d6ba167\") " pod="openshift-service-ca/service-ca-9c57cc56f-6pc55"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.029934 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-rwqck\" (UID: \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.030242 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-rwqck\" (UID: \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.034526 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bae6b6df-ce5b-473f-b03d-07b9d4380961-registry-certificates\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:10 crc kubenswrapper[4817]: E0320 12:29:10.036246 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:10.536229559 +0000 UTC m=+112.624542332 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.041546 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-registry-tls\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.041568 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/122a38f4-116c-41d1-926d-7c1c3d6ba167-signing-key\") pod \"service-ca-9c57cc56f-6pc55\" (UID: \"122a38f4-116c-41d1-926d-7c1c3d6ba167\") " pod="openshift-service-ca/service-ca-9c57cc56f-6pc55"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.047881 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bae6b6df-ce5b-473f-b03d-07b9d4380961-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.054956 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-bound-sa-token\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.055303 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5rq7n"]
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.078889 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bst9l\" (UniqueName: \"kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-kube-api-access-bst9l\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.118497 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j586r\" (UniqueName: \"kubernetes.io/projected/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-kube-api-access-j586r\") pod \"cni-sysctl-allowlist-ds-rwqck\" (UID: \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.127986 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.128298 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/70b7229d-a4be-4ee2-9b94-d7bc011986d7-mountpoint-dir\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.128320 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/70b7229d-a4be-4ee2-9b94-d7bc011986d7-plugins-dir\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.128354 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0f4ee62-4cd8-4085-855e-9304b9ce5018-metrics-tls\") pod \"dns-default-76blx\" (UID: \"b0f4ee62-4cd8-4085-855e-9304b9ce5018\") " pod="openshift-dns/dns-default-76blx"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.128374 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/70b7229d-a4be-4ee2-9b94-d7bc011986d7-csi-data-dir\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.128390 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fvq4\" (UniqueName: \"kubernetes.io/projected/a60d8f02-3c8e-43ef-8fd3-0aca152cfb12-kube-api-access-4fvq4\") pod \"ingress-canary-zb66q\" (UID: \"a60d8f02-3c8e-43ef-8fd3-0aca152cfb12\") " pod="openshift-ingress-canary/ingress-canary-zb66q"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.128403 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0f4ee62-4cd8-4085-855e-9304b9ce5018-config-volume\") pod \"dns-default-76blx\" (UID: \"b0f4ee62-4cd8-4085-855e-9304b9ce5018\") " pod="openshift-dns/dns-default-76blx"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.128417 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c9hf\" (UniqueName: \"kubernetes.io/projected/b0f4ee62-4cd8-4085-855e-9304b9ce5018-kube-api-access-7c9hf\") pod \"dns-default-76blx\" (UID: \"b0f4ee62-4cd8-4085-855e-9304b9ce5018\") " pod="openshift-dns/dns-default-76blx"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.128482 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/70b7229d-a4be-4ee2-9b94-d7bc011986d7-registration-dir\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.128510 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-565xs\" (UniqueName: \"kubernetes.io/projected/70b7229d-a4be-4ee2-9b94-d7bc011986d7-kube-api-access-565xs\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.128542 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/70b7229d-a4be-4ee2-9b94-d7bc011986d7-socket-dir\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.128603 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a60d8f02-3c8e-43ef-8fd3-0aca152cfb12-cert\") pod \"ingress-canary-zb66q\" (UID: \"a60d8f02-3c8e-43ef-8fd3-0aca152cfb12\") " pod="openshift-ingress-canary/ingress-canary-zb66q"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.132790 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bqnjh"]
Mar 20 12:29:10 crc kubenswrapper[4817]: E0320 12:29:10.133258 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:10.633209532 +0000 UTC m=+112.721522315 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.133302 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/70b7229d-a4be-4ee2-9b94-d7bc011986d7-mountpoint-dir\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.133499 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a60d8f02-3c8e-43ef-8fd3-0aca152cfb12-cert\") pod \"ingress-canary-zb66q\" (UID: \"a60d8f02-3c8e-43ef-8fd3-0aca152cfb12\") " pod="openshift-ingress-canary/ingress-canary-zb66q"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.133534 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/70b7229d-a4be-4ee2-9b94-d7bc011986d7-plugins-dir\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.134704 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0f4ee62-4cd8-4085-855e-9304b9ce5018-config-volume\") pod \"dns-default-76blx\" (UID: \"b0f4ee62-4cd8-4085-855e-9304b9ce5018\") " pod="openshift-dns/dns-default-76blx"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.134984 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/70b7229d-a4be-4ee2-9b94-d7bc011986d7-registration-dir\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.135067 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/70b7229d-a4be-4ee2-9b94-d7bc011986d7-socket-dir\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.135142 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/70b7229d-a4be-4ee2-9b94-d7bc011986d7-csi-data-dir\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.136342 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7849m\" (UniqueName: \"kubernetes.io/projected/122a38f4-116c-41d1-926d-7c1c3d6ba167-kube-api-access-7849m\") pod \"service-ca-9c57cc56f-6pc55\" (UID: \"122a38f4-116c-41d1-926d-7c1c3d6ba167\") " pod="openshift-service-ca/service-ca-9c57cc56f-6pc55"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.175589 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0f4ee62-4cd8-4085-855e-9304b9ce5018-metrics-tls\") pod \"dns-default-76blx\" (UID: \"b0f4ee62-4cd8-4085-855e-9304b9ce5018\") " pod="openshift-dns/dns-default-76blx"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.178227 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fvq4\" (UniqueName: \"kubernetes.io/projected/a60d8f02-3c8e-43ef-8fd3-0aca152cfb12-kube-api-access-4fvq4\") pod \"ingress-canary-zb66q\" (UID: \"a60d8f02-3c8e-43ef-8fd3-0aca152cfb12\") " pod="openshift-ingress-canary/ingress-canary-zb66q"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.184992 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c9hf\" (UniqueName: \"kubernetes.io/projected/b0f4ee62-4cd8-4085-855e-9304b9ce5018-kube-api-access-7c9hf\") pod \"dns-default-76blx\" (UID: \"b0f4ee62-4cd8-4085-855e-9304b9ce5018\") " pod="openshift-dns/dns-default-76blx"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.223597 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-565xs\" (UniqueName: \"kubernetes.io/projected/70b7229d-a4be-4ee2-9b94-d7bc011986d7-kube-api-access-565xs\") pod \"csi-hostpathplugin-55j94\" (UID: \"70b7229d-a4be-4ee2-9b94-d7bc011986d7\") " pod="hostpath-provisioner/csi-hostpathplugin-55j94"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.237587 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:10 crc kubenswrapper[4817]: E0320 12:29:10.240813 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:10.740799491 +0000 UTC m=+112.829112274 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.258706 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-8h755" event={"ID":"b64a28cb-8f75-48c6-8980-8a1003ffba98","Type":"ContainerStarted","Data":"888044d7fd233d4e988360ff94939ec36d83e1064a526816d2cfaa7bb7310386"}
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.258750 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-8h755" event={"ID":"b64a28cb-8f75-48c6-8980-8a1003ffba98","Type":"ContainerStarted","Data":"7bf9b3d9b4ed8b2ad5d5d916fd18e832eab6d567ca43275261df0154c35c446f"}
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.276824 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" event={"ID":"a2493ed0-295f-4eba-8870-3f5716a76ca6","Type":"ContainerStarted","Data":"911b7bd737f4aaefa14bdaa92652b67d5587feee6412f508ae8da20d25212b76"}
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.276862 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" event={"ID":"a2493ed0-295f-4eba-8870-3f5716a76ca6","Type":"ContainerStarted","Data":"33c897ec87113e97c15fdf27b6d5aa080866f955b2b97e25a34ade2963df6642"}
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.277183 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.282482 4817 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-v6wkw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.27:6443/healthz\": dial tcp 10.217.0.27:6443: connect: connection refused" start-of-body=
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.282534 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" podUID="a2493ed0-295f-4eba-8870-3f5716a76ca6" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.27:6443/healthz\": dial tcp 10.217.0.27:6443: connect: connection refused"
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.286667 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" event={"ID":"2713a227-3462-4d3d-86ff-6cb0101ef6be","Type":"ContainerStarted","Data":"2927e129b3e55e31f60c95b76ff984a33f02e6757cdd45b1b5b067f706902740"}
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.313484 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n45lz" event={"ID":"05c7c173-9fcc-415c-a13e-9290bc4e5735","Type":"ContainerStarted","Data":"bb0d234df806eaca8a2af2a787b108d42d6457d398aa6c86efa348c9f8d40a4e"}
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.313561 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n45lz" event={"ID":"05c7c173-9fcc-415c-a13e-9290bc4e5735","Type":"ContainerStarted","Data":"47c03b46428310138ec1ad8b4d20eab6850d239accc4363e804b916fc37f61ec"}
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.322795 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-j7kqr" event={"ID":"b5314572-4537-46f6-a0f3-08112dd1d556","Type":"ContainerStarted","Data":"a20cbb31eab336bd192c798c52718be71d80d479b43b3dd17850d142dce0f817"}
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.325623 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" event={"ID":"479d3864-1c15-4044-84b2-376cc2b603b6","Type":"ContainerStarted","Data":"b234a1bfd1623f62af78c1748ac4bc01c2557e82eeb39d997e84954da44b0997"}
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.326741 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7vdpg"]
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.326781 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" event={"ID":"f5457083-eb9e-4828-839c-a7613592278e","Type":"ContainerStarted","Data":"aca45b8bff85d9946fb90ceee630dd670d8881026cf8647befd776e32d866e6a"}
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.331866 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nb552" event={"ID":"9148329e-b499-499d-a1a6-b1ff1368de6c","Type":"ContainerStarted","Data":"5c21789109b9927f2745a7b4b19ffd8e9beb22ce284386d3b4d38f0023d3ee60"}
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.331917 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nb552" event={"ID":"9148329e-b499-499d-a1a6-b1ff1368de6c","Type":"ContainerStarted","Data":"9060c7b389efc2db832c86c1603dafc30157ead60ea560795f4a15586c57a79e"}
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.334960 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mzrc4" event={"ID":"9351978b-90a5-48f6-ba2b-68e2c4f2c574","Type":"ContainerStarted","Data":"e587b99b869534645c26b1cf09d6767bbbb12aec40c925bccd8487b37e261d71"}
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.335007 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mzrc4" event={"ID":"9351978b-90a5-48f6-ba2b-68e2c4f2c574","Type":"ContainerStarted","Data":"e0e7ae051ccbb5b2a6d9f6949eee0238411c0d608d656e28e07d4d05b073fbfa"}
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.335909 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h694w" event={"ID":"b432f9f8-8e62-4ba9-b852-3e670ea6cdbe","Type":"ContainerStarted","Data":"3ffc0934aba7dc625fe0b0d5b5fbbf95cd9585ce985744b3b31f85e99b424457"}
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.337027 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2jblp" event={"ID":"928ba203-a815-4c6d-9097-e1eafd194ab0","Type":"ContainerStarted","Data":"7198ce78ee60c4bda14779a22d1f2da2a3d31cda9d9cc519ae4f9cf92cda77f4"}
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.337974 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6lwrj" event={"ID":"e845d329-b1ce-48a9-8088-cbb4aabe49e4","Type":"ContainerStarted","Data":"94045410ad7c90d66158704609ad62553e07461d3125fadbe6a89bfca1782f3a"}
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.338036 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mn5j"]
Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.338098 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 12:29:10 crc kubenswrapper[4817]: E0320 12:29:10.338321 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:10.838302468 +0000 UTC m=+112.926615251 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.338412 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:10 crc kubenswrapper[4817]: E0320 12:29:10.338788 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:10.838780542 +0000 UTC m=+112.927093325 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.339663 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r5qk5" event={"ID":"df4eb359-67fa-4b25-b909-a21b16b5be2d","Type":"ContainerStarted","Data":"5704db1325eed754bcd21584842e55afe9c72c23343ac76d48950de75ee7108b"} Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.339709 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r5qk5" event={"ID":"df4eb359-67fa-4b25-b909-a21b16b5be2d","Type":"ContainerStarted","Data":"397915cbb6f0298e2fa8cb72b72a4a58535afe7047beaf2d6b490e074da214ab"} Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.342606 4817 generic.go:334] "Generic (PLEG): container finished" podID="b3057700-8d5d-4e2e-8e0c-66490bfe55c5" containerID="eb80d9bfe1b5c6c1c10e6bb360eeacb70ed97020e635c1cbbabed2d2a41c72bc" exitCode=0 Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.342835 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gbltz" event={"ID":"b3057700-8d5d-4e2e-8e0c-66490bfe55c5","Type":"ContainerDied","Data":"eb80d9bfe1b5c6c1c10e6bb360eeacb70ed97020e635c1cbbabed2d2a41c72bc"} Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.342873 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gbltz" 
event={"ID":"b3057700-8d5d-4e2e-8e0c-66490bfe55c5","Type":"ContainerStarted","Data":"c21a79e6e4a9b23657d412f5e5f8d3fca4009223c7e57d13df58c1bd0a5d7672"} Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.345398 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" event={"ID":"df2df950-540c-408e-a555-81b7e7da9e26","Type":"ContainerStarted","Data":"c3f13b771077ca9e983dc21573c89eccf2f33e9a49c82d884223486ae67ff094"} Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.346403 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cw976" event={"ID":"1efe6fbf-e6da-4413-ab32-6c457572e894","Type":"ContainerStarted","Data":"6fc2dbdc898bf9931bc5f3f2a4bb8ba3232a0efcb69eaaf18d7ab646069dc926"} Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.347641 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" event={"ID":"e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb","Type":"ContainerStarted","Data":"595f7e14888491bd2a58b8c600d08b3347a6c6aa6a466c58e4658ce4bdcffca9"} Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.347662 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" event={"ID":"e819d4db-e0d9-4bc4-afe2-7cea81b1d0fb","Type":"ContainerStarted","Data":"8508007d6ff89fd238010caebf472ecefad96f290fc7fba2fcb06a9443ed30d9"} Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.350101 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm" event={"ID":"a9414873-8b3d-4dfb-94a2-604382a77729","Type":"ContainerStarted","Data":"5af732b8be2cc84153f47138a6d6d9e287ef82a3dc71e226c87a97b5d090973e"} Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.352007 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" event={"ID":"4e26a2f6-3ad6-4113-8120-fc0af5b772c6","Type":"ContainerStarted","Data":"e4429a4ef29c1eb65458aa879bab4ef9cda5455e38b14f2a073df2fc3df44eb1"} Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.355536 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" event={"ID":"9f644c83-90e3-4eb1-80b0-82781b255d15","Type":"ContainerStarted","Data":"70f43f32defc76e16cd53ad976a92dd04f57b7e8d74a4ae5bd540b5ee275c968"} Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.357161 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" event={"ID":"da4ab27a-0064-4501-9475-4eecdd9ffbcc","Type":"ContainerStarted","Data":"19bdf69f9e47575ab7eee629bac778d4706827b572ea1f60e96177e8e11e3d63"} Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.362309 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6pc55" Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.400466 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.406408 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zb66q" Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.420812 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.437809 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-55j94" Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.437856 4817 patch_prober.go:28] interesting pod/router-default-5444994796-nb552 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.438711 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nb552" podUID="9148329e-b499-499d-a1a6-b1ff1368de6c" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.439541 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:10 crc kubenswrapper[4817]: E0320 12:29:10.440832 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:10.940817106 +0000 UTC m=+113.029129889 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.448717 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-76blx" Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.541399 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:10 crc kubenswrapper[4817]: E0320 12:29:10.541754 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:11.041738639 +0000 UTC m=+113.130051422 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.629711 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2z24k"] Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.646111 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:10 crc kubenswrapper[4817]: E0320 12:29:10.646557 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:11.146532929 +0000 UTC m=+113.234845702 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.754109 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:10 crc kubenswrapper[4817]: E0320 12:29:10.756945 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:11.256929096 +0000 UTC m=+113.345241879 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.793351 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r"] Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.794760 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqwpw"] Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.819449 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj"] Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.820951 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq2qb"] Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.831802 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj"] Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.858713 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:10 crc kubenswrapper[4817]: E0320 12:29:10.859173 4817 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:11.359152035 +0000 UTC m=+113.447464819 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.859431 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:10 crc kubenswrapper[4817]: E0320 12:29:10.859744 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:11.359730842 +0000 UTC m=+113.448043625 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.891529 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq"] Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.907170 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf"] Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.907563 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg"] Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.946479 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2jnf2" podStartSLOduration=72.946457129 podStartE2EDuration="1m12.946457129s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:10.932035727 +0000 UTC m=+113.020348510" watchObservedRunningTime="2026-03-20 12:29:10.946457129 +0000 UTC m=+113.034769912" Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.950450 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-g9lk4"] Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.960379 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:10 crc kubenswrapper[4817]: E0320 12:29:10.963923 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:11.460700366 +0000 UTC m=+113.549013159 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:10 crc kubenswrapper[4817]: I0320 12:29:10.976676 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6"] Mar 20 12:29:10 crc kubenswrapper[4817]: W0320 12:29:10.997072 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3942ee53_6987_42c9_85ed_e5b799a1555d.slice/crio-b052815c95b4b40238cf807e2fa87faaf2d85cab9155dd64534afb37517887ff WatchSource:0}: Error finding container b052815c95b4b40238cf807e2fa87faaf2d85cab9155dd64534afb37517887ff: Status 404 returned error can't find the container with id b052815c95b4b40238cf807e2fa87faaf2d85cab9155dd64534afb37517887ff Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.062043 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:11 crc kubenswrapper[4817]: E0320 12:29:11.062436 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:11.56242149 +0000 UTC m=+113.650734273 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.168879 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:11 crc kubenswrapper[4817]: E0320 12:29:11.169322 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:11.669293669 +0000 UTC m=+113.757606452 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.169595 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:11 crc kubenswrapper[4817]: E0320 12:29:11.170187 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:11.670174013 +0000 UTC m=+113.758486796 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.172391 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" podStartSLOduration=74.172351564 podStartE2EDuration="1m14.172351564s" podCreationTimestamp="2026-03-20 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:11.170664107 +0000 UTC m=+113.258976880" watchObservedRunningTime="2026-03-20 12:29:11.172351564 +0000 UTC m=+113.260664347" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.180050 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb"] Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.251375 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zb66q"] Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.271060 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:11 crc kubenswrapper[4817]: E0320 12:29:11.272031 4817 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:11.772002251 +0000 UTC m=+113.860315034 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.273657 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:11 crc kubenswrapper[4817]: E0320 12:29:11.299100 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:11.799084956 +0000 UTC m=+113.887397739 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.378647 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:11 crc kubenswrapper[4817]: E0320 12:29:11.379013 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:11.878991143 +0000 UTC m=+113.967303926 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.379152 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:11 crc kubenswrapper[4817]: E0320 12:29:11.379446 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:11.879432916 +0000 UTC m=+113.967745699 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.400721 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-76blx"] Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.421917 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" event={"ID":"3942ee53-6987-42c9-85ed-e5b799a1555d","Type":"ContainerStarted","Data":"b052815c95b4b40238cf807e2fa87faaf2d85cab9155dd64534afb37517887ff"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.424226 4817 patch_prober.go:28] interesting pod/router-default-5444994796-nb552 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 12:29:11 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Mar 20 12:29:11 crc kubenswrapper[4817]: [+]process-running ok Mar 20 12:29:11 crc kubenswrapper[4817]: healthz check failed Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.424259 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nb552" podUID="9148329e-b499-499d-a1a6-b1ff1368de6c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.429079 4817 generic.go:334] "Generic (PLEG): container finished" podID="b432f9f8-8e62-4ba9-b852-3e670ea6cdbe" 
containerID="3c9d68f65bafee4f7449c54fcc13af89d96e5742a4cbaebb7aafd6a1b5562b1c" exitCode=0 Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.429215 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h694w" event={"ID":"b432f9f8-8e62-4ba9-b852-3e670ea6cdbe","Type":"ContainerDied","Data":"3c9d68f65bafee4f7449c54fcc13af89d96e5742a4cbaebb7aafd6a1b5562b1c"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.443586 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bgdpq" podStartSLOduration=74.443567833 podStartE2EDuration="1m14.443567833s" podCreationTimestamp="2026-03-20 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:11.441807914 +0000 UTC m=+113.530120737" watchObservedRunningTime="2026-03-20 12:29:11.443567833 +0000 UTC m=+113.531880616" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.443729 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9lk4" event={"ID":"87a729e2-4d7e-4f68-bef5-ef25d4ab5b43","Type":"ContainerStarted","Data":"e26099481bd9fd4fc5449b5c39216dfd85f74bccc0ff69acd013ccf466b8141b"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.452189 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6pc55"] Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.476547 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg" event={"ID":"17ba5d71-5532-4d72-9505-9bab643bbd40","Type":"ContainerStarted","Data":"057b52d65f5d5437096c8a5645aa55db20270090fcf917a3e6e73cb6679715ff"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.477149 4817 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["hostpath-provisioner/csi-hostpathplugin-55j94"] Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.480562 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.480783 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" event={"ID":"4d926e43-e19a-460f-8e87-1fe72e62d352","Type":"ContainerStarted","Data":"d3034b987c63b19823c9011c2fc43896cf70eb054d274b0efd4c577f3316220c"} Mar 20 12:29:11 crc kubenswrapper[4817]: E0320 12:29:11.481090 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:11.981075029 +0000 UTC m=+114.069387812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.481593 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.483432 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" event={"ID":"6bdeca1d-6768-4d19-a080-885b98c47f5b","Type":"ContainerStarted","Data":"11b1cb8f76e1719a48011ddd70793d0376838ec3d2032a3bdf17c4e2bb142c54"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.483521 4817 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bqnjh container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.483547 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" podUID="4d926e43-e19a-460f-8e87-1fe72e62d352" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.489859 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm" 
event={"ID":"a9414873-8b3d-4dfb-94a2-604382a77729","Type":"ContainerStarted","Data":"80aab6db89ec702c1ea182252b998472dec0307d8b07c9b1a372a095ebcf7a2d"} Mar 20 12:29:11 crc kubenswrapper[4817]: W0320 12:29:11.503891 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1907dc62_9c82_492c_a7b9_642410848e1f.slice/crio-b68be7d125e6a528d7ee51f2fba0507b96a819f1eaaa171b4d3fcc718a47f9ae WatchSource:0}: Error finding container b68be7d125e6a528d7ee51f2fba0507b96a819f1eaaa171b4d3fcc718a47f9ae: Status 404 returned error can't find the container with id b68be7d125e6a528d7ee51f2fba0507b96a819f1eaaa171b4d3fcc718a47f9ae Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.506844 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" event={"ID":"df2df950-540c-408e-a555-81b7e7da9e26","Type":"ContainerStarted","Data":"04ee0a5c4bf3b02d2fe9ac7dafb951b8e659511e71d8cf5a5229931cdc0f27a4"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.507635 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.511656 4817 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-jsgjh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.511706 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" podUID="df2df950-540c-408e-a555-81b7e7da9e26" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 20 
12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.529081 4817 ???:1] "http: TLS handshake error from 192.168.126.11:33738: no serving certificate available for the kubelet" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.529374 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" event={"ID":"413aaaf0-f437-4a19-845a-6b5c1bfccd07","Type":"ContainerStarted","Data":"f8c293b4df7f9397cf7b78511b180951aa2d5904dbd5ce5e442010cba47bca4e"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.533130 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7vdpg" event={"ID":"85d5ecf6-da8d-4953-9b78-7ba019986d37","Type":"ContainerStarted","Data":"cc8ac1ed8b39322e06b1e8299e06492e2fd2de91fa75c4daf06f597426350428"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.537304 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2jblp" event={"ID":"928ba203-a815-4c6d-9097-e1eafd194ab0","Type":"ContainerStarted","Data":"4488e450bf39f4cb54738843c08039dcdb68790cefc00f402df80139dd793832"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.537669 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-2jblp" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.538747 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq2qb" event={"ID":"68eb3b35-c763-4024-8fd0-6dc63ea80eb8","Type":"ContainerStarted","Data":"30750cd219a84eb1d4fb7f2441052f585be02197690180dad61dcf93525105ee"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.545738 4817 patch_prober.go:28] interesting pod/downloads-7954f5f757-2jblp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 
10.217.0.14:8080: connect: connection refused" start-of-body= Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.545784 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2jblp" podUID="928ba203-a815-4c6d-9097-e1eafd194ab0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.547099 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" event={"ID":"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f","Type":"ContainerStarted","Data":"2696ac76ba9e4e36250aac2b2464370c0ba4db6cd66d732045ecbf369caa6bee"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.562413 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" event={"ID":"da4ab27a-0064-4501-9475-4eecdd9ffbcc","Type":"ContainerStarted","Data":"2b55a088ff201706dfd09337c2cecfbee532b56d225ea8a6192ae20e9131c2f3"} Mar 20 12:29:11 crc kubenswrapper[4817]: W0320 12:29:11.564251 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0f4ee62_4cd8_4085_855e_9304b9ce5018.slice/crio-ffc3ed71a7c9cfa3c95587aa5b2d4a115acff56d3be79381f539926233023f16 WatchSource:0}: Error finding container ffc3ed71a7c9cfa3c95587aa5b2d4a115acff56d3be79381f539926233023f16: Status 404 returned error can't find the container with id ffc3ed71a7c9cfa3c95587aa5b2d4a115acff56d3be79381f539926233023f16 Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.566288 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cw976" event={"ID":"1efe6fbf-e6da-4413-ab32-6c457572e894","Type":"ContainerStarted","Data":"b9a633c245b52863fb303abf9245492e5fdb47d2871ab71655f3e2fbbdc9d3d5"} Mar 20 12:29:11 crc kubenswrapper[4817]: 
I0320 12:29:11.575704 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" event={"ID":"479d3864-1c15-4044-84b2-376cc2b603b6","Type":"ContainerStarted","Data":"dcf6bd1ec42d13d5c56cb2ed438b33cdb90cf2f19773c1f7789a7bfc2fff88e9"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.576218 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.581574 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.581714 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs\") pod \"network-metrics-daemon-xq7wp\" (UID: \"c9cb3896-c4ff-4ccb-b494-eac8b4460342\") " pod="openshift-multus/network-metrics-daemon-xq7wp" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.582764 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-j7kqr" event={"ID":"b5314572-4537-46f6-a0f3-08112dd1d556","Type":"ContainerStarted","Data":"58800de1765982c716a32c98b984cb794c9e80c79140cbb41cfd27d34d093c51"} Mar 20 12:29:11 crc kubenswrapper[4817]: E0320 12:29:11.583683 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 12:29:12.083667788 +0000 UTC m=+114.171980571 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.588092 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" event={"ID":"f5457083-eb9e-4828-839c-a7613592278e","Type":"ContainerStarted","Data":"832ff93d6bfb0a67c7ee312a2ce8f925416d8f3dc3646c77bac551e797edf33e"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.588632 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.590458 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9cb3896-c4ff-4ccb-b494-eac8b4460342-metrics-certs\") pod \"network-metrics-daemon-xq7wp\" (UID: \"c9cb3896-c4ff-4ccb-b494-eac8b4460342\") " pod="openshift-multus/network-metrics-daemon-xq7wp" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.598675 4817 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-vt6cg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.598724 4817 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" podUID="f5457083-eb9e-4828-839c-a7613592278e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.598679 4817 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-tw2rl container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.598778 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" podUID="479d3864-1c15-4044-84b2-376cc2b603b6" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.600816 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" event={"ID":"b6ef9afb-80f9-48dd-b41d-47874fcf3be9","Type":"ContainerStarted","Data":"7b0ab47123ca151fbe78250cf81d4223b6dc1f9cc68dfc4bc393e2a9ea9fb198"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.603808 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq" event={"ID":"344f693a-912d-41fe-a9f8-c344e2770d08","Type":"ContainerStarted","Data":"0829986ca79d6a88135d8242c16c424ce0fa6cb80370f1b986adb924c61f7e36"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.611095 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5rq7n" event={"ID":"00a46bd9-a747-4587-9333-f10f27e9be52","Type":"ContainerStarted","Data":"6ca060d7c820c998fc3bfceaf95bfaccd4f696633eeebfba07dfcf793bea181a"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.615683 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6lwrj" event={"ID":"e845d329-b1ce-48a9-8088-cbb4aabe49e4","Type":"ContainerStarted","Data":"393bd87e812c2d5d159837d5b25699522c17a9ac6fc17f9ccfb8a7463874ab08"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.621178 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" event={"ID":"4e26a2f6-3ad6-4113-8120-fc0af5b772c6","Type":"ContainerStarted","Data":"92b93d704a116d5d13f1ec586253c49e93ed67ea3e2ab7623ebd976b20801afb"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.621852 4817 ???:1] "http: TLS handshake error from 192.168.126.11:33748: no serving certificate available for the kubelet" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.625593 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mn5j" event={"ID":"98722637-bcd0-4008-ad86-2a5a7e129b34","Type":"ContainerStarted","Data":"6b1671c8130008f762efbfd029b50399ee4014f79794f3f15a3f23aa828b398d"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.634758 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2z24k" event={"ID":"b89aea9e-c134-42e3-b366-067d746cf7d7","Type":"ContainerStarted","Data":"3845d049257f3906d571aa07edff67f88af77bfcfb66599de85d3bc561800ac2"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.638485 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj" event={"ID":"d0f2d38b-934b-4185-97ab-f43bfcbad479","Type":"ContainerStarted","Data":"6e2e761afc0316733ed12a75b91355899d8ebc54d8edecc23bb099c7688abcd6"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.648002 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.653316 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r5qk5" podStartSLOduration=74.653300469 podStartE2EDuration="1m14.653300469s" podCreationTimestamp="2026-03-20 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:11.6486848 +0000 UTC m=+113.736997583" watchObservedRunningTime="2026-03-20 12:29:11.653300469 +0000 UTC m=+113.741613252" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.683437 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:11 crc kubenswrapper[4817]: E0320 12:29:11.684585 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:12.18455477 +0000 UTC m=+114.272867553 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.700215 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"592399f70fccfe98a038e48372cb368985661c030b8b06eb2163c97e2b88be6c"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.700862 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.706066 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqwpw" event={"ID":"d152b635-3488-428c-b8bd-b28e7fe13bef","Type":"ContainerStarted","Data":"255e7ab29f6ab0a2764b0df3f344a9ee5a1ccc2251475bdd19d403787fd62780"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.712368 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n45lz" event={"ID":"05c7c173-9fcc-415c-a13e-9290bc4e5735","Type":"ContainerStarted","Data":"80ec1f72d534a20db610ae936f1c9ff7efbbf3efcd47ce303d4f4639518eab13"} Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.715358 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-8h755" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.721178 4817 patch_prober.go:28] interesting pod/console-operator-58897d9998-8h755 
container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.721228 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-8h755" podUID="b64a28cb-8f75-48c6-8980-8a1003ffba98" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.738157 4817 ???:1] "http: TLS handshake error from 192.168.126.11:33752: no serving certificate available for the kubelet" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.781254 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-mzrc4" podStartSLOduration=74.781219894 podStartE2EDuration="1m14.781219894s" podCreationTimestamp="2026-03-20 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:11.768933472 +0000 UTC m=+113.857246255" watchObservedRunningTime="2026-03-20 12:29:11.781219894 +0000 UTC m=+113.869532678" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.787307 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:11 crc kubenswrapper[4817]: E0320 12:29:11.789027 4817 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:12.289006721 +0000 UTC m=+114.377319584 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.803806 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-nb552" podStartSLOduration=73.803786954 podStartE2EDuration="1m13.803786954s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:11.802388615 +0000 UTC m=+113.890701398" watchObservedRunningTime="2026-03-20 12:29:11.803786954 +0000 UTC m=+113.892099737" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.827552 4817 ???:1] "http: TLS handshake error from 192.168.126.11:33756: no serving certificate available for the kubelet" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.883192 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xq7wp" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.888694 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:11 crc kubenswrapper[4817]: E0320 12:29:11.890682 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:12.390631014 +0000 UTC m=+114.478943807 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.928005 4817 ???:1] "http: TLS handshake error from 192.168.126.11:33772: no serving certificate available for the kubelet" Mar 20 12:29:11 crc kubenswrapper[4817]: I0320 12:29:11.991510 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:11 crc kubenswrapper[4817]: E0320 
12:29:11.992114 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:12.492094592 +0000 UTC m=+114.580407375 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.037855 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" podStartSLOduration=74.037820666 podStartE2EDuration="1m14.037820666s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:12.037170278 +0000 UTC m=+114.125483061" watchObservedRunningTime="2026-03-20 12:29:12.037820666 +0000 UTC m=+114.126133459" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.038770 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" podStartSLOduration=74.038760483 podStartE2EDuration="1m14.038760483s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:12.006170894 +0000 UTC m=+114.094483687" watchObservedRunningTime="2026-03-20 12:29:12.038760483 +0000 UTC m=+114.127073266" Mar 20 12:29:12 crc 
kubenswrapper[4817]: I0320 12:29:12.039388 4817 ???:1] "http: TLS handshake error from 192.168.126.11:33784: no serving certificate available for the kubelet" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.087047 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-2jblp" podStartSLOduration=75.087024508 podStartE2EDuration="1m15.087024508s" podCreationTimestamp="2026-03-20 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:12.059579313 +0000 UTC m=+114.147892096" watchObservedRunningTime="2026-03-20 12:29:12.087024508 +0000 UTC m=+114.175337301" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.094832 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:12 crc kubenswrapper[4817]: E0320 12:29:12.095405 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:12.595374341 +0000 UTC m=+114.683687124 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.097569 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.174460 4817 ???:1] "http: TLS handshake error from 192.168.126.11:33796: no serving certificate available for the kubelet" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.197004 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:12 crc kubenswrapper[4817]: E0320 12:29:12.197416 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:12.697399254 +0000 UTC m=+114.785712037 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.232997 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6lwrj" podStartSLOduration=75.232960315 podStartE2EDuration="1m15.232960315s" podCreationTimestamp="2026-03-20 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:12.221222088 +0000 UTC m=+114.309534871" watchObservedRunningTime="2026-03-20 12:29:12.232960315 +0000 UTC m=+114.321273098" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.297632 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:12 crc kubenswrapper[4817]: E0320 12:29:12.298434 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:12.79841808 +0000 UTC m=+114.886730863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.306148 4817 ???:1] "http: TLS handshake error from 192.168.126.11:33806: no serving certificate available for the kubelet" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.341483 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" podStartSLOduration=74.341458569 podStartE2EDuration="1m14.341458569s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:12.340726039 +0000 UTC m=+114.429038822" watchObservedRunningTime="2026-03-20 12:29:12.341458569 +0000 UTC m=+114.429771362" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.399330 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:12 crc kubenswrapper[4817]: E0320 12:29:12.399810 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 12:29:12.899793965 +0000 UTC m=+114.988106748 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.433113 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5rq7n" podStartSLOduration=74.433093884 podStartE2EDuration="1m14.433093884s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:12.377516314 +0000 UTC m=+114.465829087" watchObservedRunningTime="2026-03-20 12:29:12.433093884 +0000 UTC m=+114.521406667" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.438176 4817 patch_prober.go:28] interesting pod/router-default-5444994796-nb552 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 12:29:12 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Mar 20 12:29:12 crc kubenswrapper[4817]: [+]process-running ok Mar 20 12:29:12 crc kubenswrapper[4817]: healthz check failed Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.453309 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nb552" podUID="9148329e-b499-499d-a1a6-b1ff1368de6c" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.477814 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hnqtm" podStartSLOduration=74.477774129 podStartE2EDuration="1m14.477774129s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:12.465467386 +0000 UTC m=+114.553780169" watchObservedRunningTime="2026-03-20 12:29:12.477774129 +0000 UTC m=+114.566086912" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.528092 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:12 crc kubenswrapper[4817]: E0320 12:29:12.528973 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:13.028952835 +0000 UTC m=+115.117265618 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.532653 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mn5j" podStartSLOduration=74.532633258 podStartE2EDuration="1m14.532633258s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:12.527380191 +0000 UTC m=+114.615692974" watchObservedRunningTime="2026-03-20 12:29:12.532633258 +0000 UTC m=+114.620946041" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.600668 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-8h755" podStartSLOduration=75.600552661 podStartE2EDuration="1m15.600552661s" podCreationTimestamp="2026-03-20 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:12.586313194 +0000 UTC m=+114.674625977" watchObservedRunningTime="2026-03-20 12:29:12.600552661 +0000 UTC m=+114.688865444" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.629635 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:12 crc kubenswrapper[4817]: E0320 12:29:12.630213 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:13.130199827 +0000 UTC m=+115.218512610 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.676292 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-9cwss" podStartSLOduration=75.676266241 podStartE2EDuration="1m15.676266241s" podCreationTimestamp="2026-03-20 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:12.622176254 +0000 UTC m=+114.710489057" watchObservedRunningTime="2026-03-20 12:29:12.676266241 +0000 UTC m=+114.764579024" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.677023 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-n45lz" podStartSLOduration=75.677017462 podStartE2EDuration="1m15.677017462s" podCreationTimestamp="2026-03-20 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:12.652715295 +0000 UTC m=+114.741028078" watchObservedRunningTime="2026-03-20 12:29:12.677017462 +0000 UTC m=+114.765330245" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.732542 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:12 crc kubenswrapper[4817]: E0320 12:29:12.733089 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:13.233070124 +0000 UTC m=+115.321382907 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.743629 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" podStartSLOduration=74.743600768 podStartE2EDuration="1m14.743600768s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:12.733698812 +0000 UTC m=+114.822011595" watchObservedRunningTime="2026-03-20 12:29:12.743600768 +0000 UTC m=+114.831913551" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.764436 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jsgjh"] Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.764487 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xq7wp"] Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.815435 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-j7kqr" podStartSLOduration=6.815406319 podStartE2EDuration="6.815406319s" podCreationTimestamp="2026-03-20 12:29:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:12.783564862 +0000 UTC m=+114.871877645" watchObservedRunningTime="2026-03-20 12:29:12.815406319 +0000 UTC m=+114.903719102" Mar 20 
12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.816472 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg"] Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.825617 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.825595063 podStartE2EDuration="18.825595063s" podCreationTimestamp="2026-03-20 12:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:12.823450394 +0000 UTC m=+114.911763177" watchObservedRunningTime="2026-03-20 12:29:12.825595063 +0000 UTC m=+114.913907846" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.833778 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:12 crc kubenswrapper[4817]: E0320 12:29:12.834615 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:13.334600524 +0000 UTC m=+115.422913307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.857578 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-76blx" event={"ID":"b0f4ee62-4cd8-4085-855e-9304b9ce5018","Type":"ContainerStarted","Data":"ffc3ed71a7c9cfa3c95587aa5b2d4a115acff56d3be79381f539926233023f16"} Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.875530 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cw976" event={"ID":"1efe6fbf-e6da-4413-ab32-6c457572e894","Type":"ContainerStarted","Data":"274c3be6d435f3d8a987fcb22259aaa11cc6777d6b669ce7be9db5d547dd5caa"} Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.891078 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" event={"ID":"4d926e43-e19a-460f-8e87-1fe72e62d352","Type":"ContainerStarted","Data":"d7fbc95e04c0ebb57881cd89ac7e44da142ea149ef0bff4b240efbfce2e0b62d"} Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.899311 4817 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bqnjh container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.899386 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" 
podUID="4d926e43-e19a-460f-8e87-1fe72e62d352" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.934843 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:12 crc kubenswrapper[4817]: E0320 12:29:12.935224 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:13.435208049 +0000 UTC m=+115.523520832 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.950429 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mn5j" event={"ID":"98722637-bcd0-4008-ad86-2a5a7e129b34","Type":"ContainerStarted","Data":"66ef37f75b3c1cf284a8cfb5e51e8d4250af0907b43b27ca7dfc9bf840e56bb8"} Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.967477 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2z24k" event={"ID":"b89aea9e-c134-42e3-b366-067d746cf7d7","Type":"ContainerStarted","Data":"ae85f879b1915080913173f793e26f9c4291c34793b4e71ab58b1cf5ed2784ee"} Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.984689 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg" event={"ID":"17ba5d71-5532-4d72-9505-9bab643bbd40","Type":"ContainerStarted","Data":"3767fbf1a281b9a41f9bd81cef14cece29e99424ddc401443c72f3d6540c1fd7"} Mar 20 12:29:12 crc kubenswrapper[4817]: I0320 12:29:12.990924 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" event={"ID":"b6ef9afb-80f9-48dd-b41d-47874fcf3be9","Type":"ContainerStarted","Data":"787f0d850343330ae3e554b7ff9fa4cbf93eb832f1871de3208bb6e0fbd037eb"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.011296 4817 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2z24k" podStartSLOduration=75.011277459 podStartE2EDuration="1m15.011277459s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:13.009606782 +0000 UTC m=+115.097919565" watchObservedRunningTime="2026-03-20 12:29:13.011277459 +0000 UTC m=+115.099590232" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.011808 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-cw976" podStartSLOduration=75.011801483 podStartE2EDuration="1m15.011801483s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:12.942418799 +0000 UTC m=+115.030731582" watchObservedRunningTime="2026-03-20 12:29:13.011801483 +0000 UTC m=+115.100114266" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.030471 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-55j94" event={"ID":"70b7229d-a4be-4ee2-9b94-d7bc011986d7","Type":"ContainerStarted","Data":"cdacb3a27b2fc4dbdfad627edd6300df2be2b67c9c78b79cf6705961feacdba8"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.037301 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:13 crc kubenswrapper[4817]: E0320 12:29:13.040842 4817 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:13.540826112 +0000 UTC m=+115.629138895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.061910 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq" event={"ID":"344f693a-912d-41fe-a9f8-c344e2770d08","Type":"ContainerStarted","Data":"8db4f9fa90d629314e11cc027a6e7024db7c1739198cde59a456adc30ed56611"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.062454 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.087137 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zb66q" event={"ID":"a60d8f02-3c8e-43ef-8fd3-0aca152cfb12","Type":"ContainerStarted","Data":"9476d28c1dfecb2ae4539b9dc070bb66e79ecaf24aa9eaccb72cabc039dfb7cf"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.087187 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zb66q" event={"ID":"a60d8f02-3c8e-43ef-8fd3-0aca152cfb12","Type":"ContainerStarted","Data":"04c8d0e5cb40c04ec342abba88632d99464decdd05b0b2fb7f79a5dc7e6f1ce9"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.089878 4817 ???:1] "http: 
TLS handshake error from 192.168.126.11:33820: no serving certificate available for the kubelet" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.098443 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5rq7n" event={"ID":"00a46bd9-a747-4587-9333-f10f27e9be52","Type":"ContainerStarted","Data":"1e3712fa5641d5b384dd723d3ea9bd084e767783a10965d1134e2a7c55cfbeab"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.136980 4817 generic.go:334] "Generic (PLEG): container finished" podID="2713a227-3462-4d3d-86ff-6cb0101ef6be" containerID="20a4607074171811d91322dc8f9b2b36f84d0902b5bb5171266362af2b6866ca" exitCode=0 Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.137114 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" event={"ID":"2713a227-3462-4d3d-86ff-6cb0101ef6be","Type":"ContainerDied","Data":"20a4607074171811d91322dc8f9b2b36f84d0902b5bb5171266362af2b6866ca"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.142114 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:13 crc kubenswrapper[4817]: E0320 12:29:13.142279 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:13.642250909 +0000 UTC m=+115.730563682 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.143992 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:13 crc kubenswrapper[4817]: E0320 12:29:13.145660 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:13.645645444 +0000 UTC m=+115.733958227 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.170822 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" podStartSLOduration=76.170806745 podStartE2EDuration="1m16.170806745s" podCreationTimestamp="2026-03-20 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:13.16776099 +0000 UTC m=+115.256073773" watchObservedRunningTime="2026-03-20 12:29:13.170806745 +0000 UTC m=+115.259119528" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.176820 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h694w" event={"ID":"b432f9f8-8e62-4ba9-b852-3e670ea6cdbe","Type":"ContainerStarted","Data":"eb0a15d6262948eb99b2f67704307b4c99d9c971961ff6a83a67e857563bf91a"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.177545 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h694w" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.203761 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7vdpg" event={"ID":"85d5ecf6-da8d-4953-9b78-7ba019986d37","Type":"ContainerStarted","Data":"4921ded9891bf741d9bfd22a2f223ba03d577355955cafba805674ae1285d1c8"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 
12:29:13.203806 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7vdpg" event={"ID":"85d5ecf6-da8d-4953-9b78-7ba019986d37","Type":"ContainerStarted","Data":"7c0e22e09d48cea4acca43e07e9369deb54571b284566759c54f595fdd75599c"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.228796 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq2qb" event={"ID":"68eb3b35-c763-4024-8fd0-6dc63ea80eb8","Type":"ContainerStarted","Data":"e00fe54fefd8cf5191b8c12c0c7d552d0356f936ffc7e0351b2679c0a8d6da54"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.250958 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:13 crc kubenswrapper[4817]: E0320 12:29:13.253406 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:13.753355626 +0000 UTC m=+115.841668419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.257836 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" event={"ID":"6bdeca1d-6768-4d19-a080-885b98c47f5b","Type":"ContainerStarted","Data":"59c638c587fc8cf82b8e212a03180981caab4dbd5e91ccbc9d185d94d1c0981d"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.260334 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.262539 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6pc55" event={"ID":"122a38f4-116c-41d1-926d-7c1c3d6ba167","Type":"ContainerStarted","Data":"2306ca2ee6ce0e6eed175a9dc3aceb5761b68095d1c4c8c9d6f79adb8cbd0d78"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.264485 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" event={"ID":"3942ee53-6987-42c9-85ed-e5b799a1555d","Type":"ContainerStarted","Data":"109aadb0925a07daa69ea426f906592d3aadb31bb087a1cf4bb32c18faad46a4"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.284200 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gbltz" event={"ID":"b3057700-8d5d-4e2e-8e0c-66490bfe55c5","Type":"ContainerStarted","Data":"6825d5befda625bb232d76ee1daa3ea19e64a4ed3a95f41a5075f41c11736fb0"} Mar 20 12:29:13 
crc kubenswrapper[4817]: I0320 12:29:13.304423 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" event={"ID":"da4ab27a-0064-4501-9475-4eecdd9ffbcc","Type":"ContainerStarted","Data":"8403d95c8be7fecd01f9a9a60b2c8b77ed67eba4425218d8dcb4a0d4cc5cbe45"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.329591 4817 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-zqvg6 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.329651 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" podUID="6bdeca1d-6768-4d19-a080-885b98c47f5b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.352542 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:13 crc kubenswrapper[4817]: E0320 12:29:13.354023 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:13.854011431 +0000 UTC m=+115.942324214 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.393189 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h694w" podStartSLOduration=76.393176143 podStartE2EDuration="1m16.393176143s" podCreationTimestamp="2026-03-20 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:13.392070012 +0000 UTC m=+115.480382795" watchObservedRunningTime="2026-03-20 12:29:13.393176143 +0000 UTC m=+115.481488926" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.393781 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tzxvg" podStartSLOduration=75.39377636 podStartE2EDuration="1m15.39377636s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:13.253355226 +0000 UTC m=+115.341668009" watchObservedRunningTime="2026-03-20 12:29:13.39377636 +0000 UTC m=+115.482089133" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.410348 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj" 
event={"ID":"d0f2d38b-934b-4185-97ab-f43bfcbad479","Type":"ContainerStarted","Data":"aef9b0d0be54fac74946def0fc59123c98d59f7c29968fc6074db5ea16464616"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.435305 4817 patch_prober.go:28] interesting pod/router-default-5444994796-nb552 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 12:29:13 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Mar 20 12:29:13 crc kubenswrapper[4817]: [+]process-running ok Mar 20 12:29:13 crc kubenswrapper[4817]: healthz check failed Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.435358 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nb552" podUID="9148329e-b499-499d-a1a6-b1ff1368de6c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.436289 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" event={"ID":"413aaaf0-f437-4a19-845a-6b5c1bfccd07","Type":"ContainerStarted","Data":"6a8cfdb7ae4331e50cbaa244275f4c02d1ce25633c5738cdf6ccead6bc0afee3"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.437949 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" event={"ID":"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f","Type":"ContainerStarted","Data":"2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.438346 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.454280 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:13 crc kubenswrapper[4817]: E0320 12:29:13.454536 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:13.954479712 +0000 UTC m=+116.042792495 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.454780 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:13 crc kubenswrapper[4817]: E0320 12:29:13.455319 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:13.955304925 +0000 UTC m=+116.043617708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.457398 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" event={"ID":"1907dc62-9c82-492c-a7b9-642410848e1f","Type":"ContainerStarted","Data":"b68be7d125e6a528d7ee51f2fba0507b96a819f1eaaa171b4d3fcc718a47f9ae"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.457971 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.476286 4817 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-cl2qb container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" start-of-body= Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.476344 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" podUID="1907dc62-9c82-492c-a7b9-642410848e1f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.502315 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqwpw" 
event={"ID":"d152b635-3488-428c-b8bd-b28e7fe13bef","Type":"ContainerStarted","Data":"3f3ef58a76d30b252e8f7f48ac12014142593c22ff17c8e503a9575f66e5cb0c"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.534264 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9lk4" event={"ID":"87a729e2-4d7e-4f68-bef5-ef25d4ab5b43","Type":"ContainerStarted","Data":"a721ed0e04a8f9f9d934712239a20456d79b273d4776bfe4c1581b46fadc48fb"} Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.537881 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7vdpg" podStartSLOduration=75.537860436 podStartE2EDuration="1m15.537860436s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:13.476919597 +0000 UTC m=+115.565232380" watchObservedRunningTime="2026-03-20 12:29:13.537860436 +0000 UTC m=+115.626173219" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.538264 4817 patch_prober.go:28] interesting pod/downloads-7954f5f757-2jblp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.538317 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2jblp" podUID="928ba203-a815-4c6d-9097-e1eafd194ab0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.544994 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq" podStartSLOduration=75.544964204 podStartE2EDuration="1m15.544964204s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:13.5387301 +0000 UTC m=+115.627042873" watchObservedRunningTime="2026-03-20 12:29:13.544964204 +0000 UTC m=+115.633276997" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.558488 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.558900 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tw2rl" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.559985 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:13 crc kubenswrapper[4817]: E0320 12:29:13.561343 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:14.06132803 +0000 UTC m=+116.149640813 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.562677 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-8h755" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.562726 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.578371 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wq2qb" podStartSLOduration=75.578354834 podStartE2EDuration="1m15.578354834s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:13.576155983 +0000 UTC m=+115.664468766" watchObservedRunningTime="2026-03-20 12:29:13.578354834 +0000 UTC m=+115.666667627" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.590019 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.661947 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" 
(UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:13 crc kubenswrapper[4817]: E0320 12:29:13.662473 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:14.162459588 +0000 UTC m=+116.250772371 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.666024 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zb66q" podStartSLOduration=7.666007287 podStartE2EDuration="7.666007287s" podCreationTimestamp="2026-03-20 12:29:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:13.66037824 +0000 UTC m=+115.748691023" watchObservedRunningTime="2026-03-20 12:29:13.666007287 +0000 UTC m=+115.754320080" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.753458 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqwpw" podStartSLOduration=75.753441784 podStartE2EDuration="1m15.753441784s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:13.751693686 
+0000 UTC m=+115.840006469" watchObservedRunningTime="2026-03-20 12:29:13.753441784 +0000 UTC m=+115.841754557" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.763539 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:13 crc kubenswrapper[4817]: E0320 12:29:13.764343 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:14.264313847 +0000 UTC m=+116.352626630 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.849964 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" podStartSLOduration=75.849942114 podStartE2EDuration="1m15.849942114s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:13.807697927 +0000 UTC m=+115.896010710" watchObservedRunningTime="2026-03-20 12:29:13.849942114 +0000 UTC m=+115.938254897" Mar 20 12:29:13 crc 
kubenswrapper[4817]: I0320 12:29:13.850382 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" podStartSLOduration=75.850375276 podStartE2EDuration="1m15.850375276s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:13.846024885 +0000 UTC m=+115.934337678" watchObservedRunningTime="2026-03-20 12:29:13.850375276 +0000 UTC m=+115.938688059" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.864751 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:13 crc kubenswrapper[4817]: E0320 12:29:13.865574 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:14.365557439 +0000 UTC m=+116.453870212 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.893730 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" podStartSLOduration=75.893715684 podStartE2EDuration="1m15.893715684s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:13.891994466 +0000 UTC m=+115.980307249" watchObservedRunningTime="2026-03-20 12:29:13.893715684 +0000 UTC m=+115.982028467" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.964662 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6pc55" podStartSLOduration=75.964644351 podStartE2EDuration="1m15.964644351s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:13.934987824 +0000 UTC m=+116.023300597" watchObservedRunningTime="2026-03-20 12:29:13.964644351 +0000 UTC m=+116.052957124" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.965592 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-crt6p" podStartSLOduration=75.965586927 podStartE2EDuration="1m15.965586927s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:13.964382754 +0000 UTC m=+116.052695537" watchObservedRunningTime="2026-03-20 12:29:13.965586927 +0000 UTC m=+116.053899710" Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.967145 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:13 crc kubenswrapper[4817]: E0320 12:29:13.985448 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:14.48540162 +0000 UTC m=+116.573714403 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:13 crc kubenswrapper[4817]: I0320 12:29:13.995602 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" podStartSLOduration=7.995565353 podStartE2EDuration="7.995565353s" podCreationTimestamp="2026-03-20 12:29:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:13.993324 +0000 UTC m=+116.081636783" watchObservedRunningTime="2026-03-20 12:29:13.995565353 +0000 UTC m=+116.083878136" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.084231 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:14 crc kubenswrapper[4817]: E0320 12:29:14.084520 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:14.584510302 +0000 UTC m=+116.672823085 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.148353 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj" podStartSLOduration=76.148337151 podStartE2EDuration="1m16.148337151s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:14.114041765 +0000 UTC m=+116.202354548" watchObservedRunningTime="2026-03-20 12:29:14.148337151 +0000 UTC m=+116.236649934" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.184913 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:14 crc kubenswrapper[4817]: E0320 12:29:14.185265 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:14.6852456 +0000 UTC m=+116.773558383 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.289332 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:14 crc kubenswrapper[4817]: E0320 12:29:14.289656 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:14.789642669 +0000 UTC m=+116.877955452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.390306 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:14 crc kubenswrapper[4817]: E0320 12:29:14.390575 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:14.890551842 +0000 UTC m=+116.978864635 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.390943 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:14 crc kubenswrapper[4817]: E0320 12:29:14.391356 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:14.891346074 +0000 UTC m=+116.979658867 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.423434 4817 patch_prober.go:28] interesting pod/router-default-5444994796-nb552 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 12:29:14 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Mar 20 12:29:14 crc kubenswrapper[4817]: [+]process-running ok Mar 20 12:29:14 crc kubenswrapper[4817]: healthz check failed Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.423503 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nb552" podUID="9148329e-b499-499d-a1a6-b1ff1368de6c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.446918 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" podStartSLOduration=76.446903473 podStartE2EDuration="1m16.446903473s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:14.180473927 +0000 UTC m=+116.268786710" watchObservedRunningTime="2026-03-20 12:29:14.446903473 +0000 UTC m=+116.535216256" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.447079 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-multus/cni-sysctl-allowlist-ds-rwqck"] Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.492005 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:14 crc kubenswrapper[4817]: E0320 12:29:14.492150 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:14.992114033 +0000 UTC m=+117.080426816 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.492314 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:14 crc kubenswrapper[4817]: E0320 12:29:14.492590 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 12:29:14.992583086 +0000 UTC m=+117.080895869 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.502317 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sg4sx"] Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.503227 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sg4sx" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.517783 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.540833 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gbltz" event={"ID":"b3057700-8d5d-4e2e-8e0c-66490bfe55c5","Type":"ContainerStarted","Data":"8f7775992983031f0ce7270fdcfd5daaee9f5bf14f9534b308393a752fb60ea4"} Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.542010 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sg4sx"] Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.545361 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qdnmj" event={"ID":"d0f2d38b-934b-4185-97ab-f43bfcbad479","Type":"ContainerStarted","Data":"dfe123556fc8d616fd8fab9c14ad91142fd65f5c05dd604fcdedc8255f9e92cc"} Mar 20 12:29:14 crc 
kubenswrapper[4817]: I0320 12:29:14.547111 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-76blx" event={"ID":"b0f4ee62-4cd8-4085-855e-9304b9ce5018","Type":"ContainerStarted","Data":"4ec59d859b809b917c4282c0cfdc2062faf16d90de1984f17ce7732af8e6e1fd"} Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.547147 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-76blx" event={"ID":"b0f4ee62-4cd8-4085-855e-9304b9ce5018","Type":"ContainerStarted","Data":"de6188bc7309f7f38cd6aebd4f9515dd7a3bde64549feb756524f89f8520be5c"} Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.547250 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-76blx" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.548764 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hrbgf" event={"ID":"413aaaf0-f437-4a19-845a-6b5c1bfccd07","Type":"ContainerStarted","Data":"24b58d5de95c1835ade0dd915abc1081e3022489a041611dcd99323a2963dfda"} Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.556031 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xq7wp" event={"ID":"c9cb3896-c4ff-4ccb-b494-eac8b4460342","Type":"ContainerStarted","Data":"8ed08be3f15e3e609209da7ad56d5263cc2d64b679b94b16d182df1bbd5470ad"} Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.556068 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xq7wp" event={"ID":"c9cb3896-c4ff-4ccb-b494-eac8b4460342","Type":"ContainerStarted","Data":"23a0efe945df0fd0cab1a845e4d3f30b469994e61ae49871334d837033accc12"} Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.556084 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xq7wp" 
event={"ID":"c9cb3896-c4ff-4ccb-b494-eac8b4460342","Type":"ContainerStarted","Data":"adcdfd965f720c3c5eb429313442770c89d8eecb486c916667bc8d6e9a10d3af"} Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.563510 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6pc55" event={"ID":"122a38f4-116c-41d1-926d-7c1c3d6ba167","Type":"ContainerStarted","Data":"581f20fadc6e954569bc6721628800b2e7fe4557091a90d53f843e5a5821a8a1"} Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.566779 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" event={"ID":"1907dc62-9c82-492c-a7b9-642410848e1f","Type":"ContainerStarted","Data":"9743fd79c059fc7bff355b5e49dc200b83e55ff7a71c9e590b25b90121231783"} Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.569406 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9lk4" event={"ID":"87a729e2-4d7e-4f68-bef5-ef25d4ab5b43","Type":"ContainerStarted","Data":"cac4a0d0b2c162ff8e861d2c54238144b4809b25941fe27e8541f0310d7f6744"} Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.571320 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq" event={"ID":"344f693a-912d-41fe-a9f8-c344e2770d08","Type":"ContainerStarted","Data":"55d371b0186a684fd6c1e39ea0581a8b2508509b0eecc40324585a72d9c09a58"} Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.574292 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-prq7r" event={"ID":"3942ee53-6987-42c9-85ed-e5b799a1555d","Type":"ContainerStarted","Data":"51910addbdae2101035dd3483205cffe7805071cd78f8148456101073f5d02ce"} Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.580160 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" event={"ID":"2713a227-3462-4d3d-86ff-6cb0101ef6be","Type":"ContainerStarted","Data":"abd52a5bd7af56db8db11e6f1a698392c5fb8e9d317d071d00348c619b6c269d"} Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.582179 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-55j94" event={"ID":"70b7229d-a4be-4ee2-9b94-d7bc011986d7","Type":"ContainerStarted","Data":"86dda292f986cb6763022ed69a325a9134143826880b8a05bd70e63b4e8fc31d"} Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.582856 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" podUID="f5457083-eb9e-4828-839c-a7613592278e" containerName="route-controller-manager" containerID="cri-o://832ff93d6bfb0a67c7ee312a2ce8f925416d8f3dc3646c77bac551e797edf33e" gracePeriod=30 Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.587334 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" podUID="df2df950-540c-408e-a555-81b7e7da9e26" containerName="controller-manager" containerID="cri-o://04ee0a5c4bf3b02d2fe9ac7dafb951b8e659511e71d8cf5a5229931cdc0f27a4" gracePeriod=30 Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.587726 4817 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bqnjh container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.587750 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" podUID="4d926e43-e19a-460f-8e87-1fe72e62d352" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.595636 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:14 crc kubenswrapper[4817]: E0320 12:29:14.597266 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:15.097249822 +0000 UTC m=+117.185562595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.597934 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zqvg6" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.607493 4817 ???:1] "http: TLS handshake error from 192.168.126.11:43106: no serving certificate available for the kubelet" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.682997 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xq7wp" podStartSLOduration=76.682981642 podStartE2EDuration="1m16.682981642s" 
podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:14.6793474 +0000 UTC m=+116.767660193" watchObservedRunningTime="2026-03-20 12:29:14.682981642 +0000 UTC m=+116.771294425" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.696880 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98014d2a-ca27-4147-a4cf-081ce9325a83-catalog-content\") pod \"certified-operators-sg4sx\" (UID: \"98014d2a-ca27-4147-a4cf-081ce9325a83\") " pod="openshift-marketplace/certified-operators-sg4sx" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.697113 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.697161 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd68d\" (UniqueName: \"kubernetes.io/projected/98014d2a-ca27-4147-a4cf-081ce9325a83-kube-api-access-wd68d\") pod \"certified-operators-sg4sx\" (UID: \"98014d2a-ca27-4147-a4cf-081ce9325a83\") " pod="openshift-marketplace/certified-operators-sg4sx" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.697443 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98014d2a-ca27-4147-a4cf-081ce9325a83-utilities\") pod \"certified-operators-sg4sx\" (UID: \"98014d2a-ca27-4147-a4cf-081ce9325a83\") " 
pod="openshift-marketplace/certified-operators-sg4sx" Mar 20 12:29:14 crc kubenswrapper[4817]: E0320 12:29:14.699211 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:15.199193553 +0000 UTC m=+117.287506326 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.746231 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-52mbr"] Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.774377 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-52mbr"] Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.775093 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-52mbr" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.787739 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-gbltz" podStartSLOduration=77.787720701 podStartE2EDuration="1m17.787720701s" podCreationTimestamp="2026-03-20 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:14.781039325 +0000 UTC m=+116.869352108" watchObservedRunningTime="2026-03-20 12:29:14.787720701 +0000 UTC m=+116.876033484" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.788319 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.844819 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-76blx" podStartSLOduration=7.844799162 podStartE2EDuration="7.844799162s" podCreationTimestamp="2026-03-20 12:29:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:14.819597969 +0000 UTC m=+116.907910752" watchObservedRunningTime="2026-03-20 12:29:14.844799162 +0000 UTC m=+116.933111945" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.851577 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:14 crc kubenswrapper[4817]: E0320 12:29:14.851743 4817 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:15.351707154 +0000 UTC m=+117.440019947 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.851857 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98014d2a-ca27-4147-a4cf-081ce9325a83-catalog-content\") pod \"certified-operators-sg4sx\" (UID: \"98014d2a-ca27-4147-a4cf-081ce9325a83\") " pod="openshift-marketplace/certified-operators-sg4sx" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.851940 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.851968 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd68d\" (UniqueName: \"kubernetes.io/projected/98014d2a-ca27-4147-a4cf-081ce9325a83-kube-api-access-wd68d\") pod \"certified-operators-sg4sx\" (UID: \"98014d2a-ca27-4147-a4cf-081ce9325a83\") " pod="openshift-marketplace/certified-operators-sg4sx" Mar 20 
12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.851999 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-utilities\") pod \"community-operators-52mbr\" (UID: \"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7\") " pod="openshift-marketplace/community-operators-52mbr" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.852027 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-catalog-content\") pod \"community-operators-52mbr\" (UID: \"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7\") " pod="openshift-marketplace/community-operators-52mbr" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.852094 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98014d2a-ca27-4147-a4cf-081ce9325a83-utilities\") pod \"certified-operators-sg4sx\" (UID: \"98014d2a-ca27-4147-a4cf-081ce9325a83\") " pod="openshift-marketplace/certified-operators-sg4sx" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.852219 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgmld\" (UniqueName: \"kubernetes.io/projected/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-kube-api-access-fgmld\") pod \"community-operators-52mbr\" (UID: \"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7\") " pod="openshift-marketplace/community-operators-52mbr" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.852649 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98014d2a-ca27-4147-a4cf-081ce9325a83-catalog-content\") pod \"certified-operators-sg4sx\" (UID: \"98014d2a-ca27-4147-a4cf-081ce9325a83\") " 
pod="openshift-marketplace/certified-operators-sg4sx" Mar 20 12:29:14 crc kubenswrapper[4817]: E0320 12:29:14.852919 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:15.352910628 +0000 UTC m=+117.441223421 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.853434 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98014d2a-ca27-4147-a4cf-081ce9325a83-utilities\") pod \"certified-operators-sg4sx\" (UID: \"98014d2a-ca27-4147-a4cf-081ce9325a83\") " pod="openshift-marketplace/certified-operators-sg4sx" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.942145 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qjbdq"] Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.943509 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qjbdq" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.952709 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9lk4" podStartSLOduration=76.952685689 podStartE2EDuration="1m16.952685689s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:14.944051978 +0000 UTC m=+117.032364761" watchObservedRunningTime="2026-03-20 12:29:14.952685689 +0000 UTC m=+117.040998472" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.953761 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.954020 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgmld\" (UniqueName: \"kubernetes.io/projected/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-kube-api-access-fgmld\") pod \"community-operators-52mbr\" (UID: \"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7\") " pod="openshift-marketplace/community-operators-52mbr" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.954103 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-utilities\") pod \"community-operators-52mbr\" (UID: \"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7\") " pod="openshift-marketplace/community-operators-52mbr" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.954138 4817 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-catalog-content\") pod \"community-operators-52mbr\" (UID: \"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7\") " pod="openshift-marketplace/community-operators-52mbr" Mar 20 12:29:14 crc kubenswrapper[4817]: E0320 12:29:14.954300 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:15.454272463 +0000 UTC m=+117.542585256 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.955096 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-utilities\") pod \"community-operators-52mbr\" (UID: \"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7\") " pod="openshift-marketplace/community-operators-52mbr" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.968076 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd68d\" (UniqueName: \"kubernetes.io/projected/98014d2a-ca27-4147-a4cf-081ce9325a83-kube-api-access-wd68d\") pod \"certified-operators-sg4sx\" (UID: \"98014d2a-ca27-4147-a4cf-081ce9325a83\") " pod="openshift-marketplace/certified-operators-sg4sx" Mar 20 12:29:14 crc kubenswrapper[4817]: I0320 12:29:14.974369 4817 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-catalog-content\") pod \"community-operators-52mbr\" (UID: \"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7\") " pod="openshift-marketplace/community-operators-52mbr" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.073070 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgmld\" (UniqueName: \"kubernetes.io/projected/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-kube-api-access-fgmld\") pod \"community-operators-52mbr\" (UID: \"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7\") " pod="openshift-marketplace/community-operators-52mbr" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.076062 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qjbdq"] Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.086857 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l62f2\" (UniqueName: \"kubernetes.io/projected/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-kube-api-access-l62f2\") pod \"certified-operators-qjbdq\" (UID: \"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9\") " pod="openshift-marketplace/certified-operators-qjbdq" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.086933 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-catalog-content\") pod \"certified-operators-qjbdq\" (UID: \"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9\") " pod="openshift-marketplace/certified-operators-qjbdq" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.086956 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.087000 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-utilities\") pod \"certified-operators-qjbdq\" (UID: \"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9\") " pod="openshift-marketplace/certified-operators-qjbdq" Mar 20 12:29:15 crc kubenswrapper[4817]: E0320 12:29:15.087298 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:15.58728403 +0000 UTC m=+117.675596813 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.100194 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fb87f"] Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.101823 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fb87f" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.116450 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sg4sx" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.116726 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" podStartSLOduration=77.11670145 podStartE2EDuration="1m17.11670145s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:15.095416607 +0000 UTC m=+117.183729390" watchObservedRunningTime="2026-03-20 12:29:15.11670145 +0000 UTC m=+117.205014233" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.140235 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fb87f"] Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.140865 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-52mbr" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.193765 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.194048 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-utilities\") pod \"certified-operators-qjbdq\" (UID: \"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9\") " pod="openshift-marketplace/certified-operators-qjbdq" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.194095 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l62f2\" (UniqueName: 
\"kubernetes.io/projected/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-kube-api-access-l62f2\") pod \"certified-operators-qjbdq\" (UID: \"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9\") " pod="openshift-marketplace/certified-operators-qjbdq" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.194133 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x69fn\" (UniqueName: \"kubernetes.io/projected/870bce80-0a37-4d5f-afbf-ffa85ea34a03-kube-api-access-x69fn\") pod \"community-operators-fb87f\" (UID: \"870bce80-0a37-4d5f-afbf-ffa85ea34a03\") " pod="openshift-marketplace/community-operators-fb87f" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.194168 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870bce80-0a37-4d5f-afbf-ffa85ea34a03-utilities\") pod \"community-operators-fb87f\" (UID: \"870bce80-0a37-4d5f-afbf-ffa85ea34a03\") " pod="openshift-marketplace/community-operators-fb87f" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.194202 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-catalog-content\") pod \"certified-operators-qjbdq\" (UID: \"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9\") " pod="openshift-marketplace/certified-operators-qjbdq" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.194236 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870bce80-0a37-4d5f-afbf-ffa85ea34a03-catalog-content\") pod \"community-operators-fb87f\" (UID: \"870bce80-0a37-4d5f-afbf-ffa85ea34a03\") " pod="openshift-marketplace/community-operators-fb87f" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.194546 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-utilities\") pod \"certified-operators-qjbdq\" (UID: \"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9\") " pod="openshift-marketplace/certified-operators-qjbdq" Mar 20 12:29:15 crc kubenswrapper[4817]: E0320 12:29:15.194641 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:15.694621512 +0000 UTC m=+117.782934295 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.194846 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-catalog-content\") pod \"certified-operators-qjbdq\" (UID: \"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9\") " pod="openshift-marketplace/certified-operators-qjbdq" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.259216 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l62f2\" (UniqueName: \"kubernetes.io/projected/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-kube-api-access-l62f2\") pod \"certified-operators-qjbdq\" (UID: \"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9\") " pod="openshift-marketplace/certified-operators-qjbdq" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.285642 4817 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.299560 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.299605 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870bce80-0a37-4d5f-afbf-ffa85ea34a03-catalog-content\") pod \"community-operators-fb87f\" (UID: \"870bce80-0a37-4d5f-afbf-ffa85ea34a03\") " pod="openshift-marketplace/community-operators-fb87f" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.299672 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x69fn\" (UniqueName: \"kubernetes.io/projected/870bce80-0a37-4d5f-afbf-ffa85ea34a03-kube-api-access-x69fn\") pod \"community-operators-fb87f\" (UID: \"870bce80-0a37-4d5f-afbf-ffa85ea34a03\") " pod="openshift-marketplace/community-operators-fb87f" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.299703 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870bce80-0a37-4d5f-afbf-ffa85ea34a03-utilities\") pod \"community-operators-fb87f\" (UID: \"870bce80-0a37-4d5f-afbf-ffa85ea34a03\") " pod="openshift-marketplace/community-operators-fb87f" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.300168 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870bce80-0a37-4d5f-afbf-ffa85ea34a03-utilities\") pod 
\"community-operators-fb87f\" (UID: \"870bce80-0a37-4d5f-afbf-ffa85ea34a03\") " pod="openshift-marketplace/community-operators-fb87f" Mar 20 12:29:15 crc kubenswrapper[4817]: E0320 12:29:15.300456 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:15.800442701 +0000 UTC m=+117.888755484 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.300823 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870bce80-0a37-4d5f-afbf-ffa85ea34a03-catalog-content\") pod \"community-operators-fb87f\" (UID: \"870bce80-0a37-4d5f-afbf-ffa85ea34a03\") " pod="openshift-marketplace/community-operators-fb87f" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.309161 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qjbdq" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.329875 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x69fn\" (UniqueName: \"kubernetes.io/projected/870bce80-0a37-4d5f-afbf-ffa85ea34a03-kube-api-access-x69fn\") pod \"community-operators-fb87f\" (UID: \"870bce80-0a37-4d5f-afbf-ffa85ea34a03\") " pod="openshift-marketplace/community-operators-fb87f" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.404514 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.404612 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkcl2\" (UniqueName: \"kubernetes.io/projected/df2df950-540c-408e-a555-81b7e7da9e26-kube-api-access-fkcl2\") pod \"df2df950-540c-408e-a555-81b7e7da9e26\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.404644 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df2df950-540c-408e-a555-81b7e7da9e26-serving-cert\") pod \"df2df950-540c-408e-a555-81b7e7da9e26\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.404664 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-proxy-ca-bundles\") pod \"df2df950-540c-408e-a555-81b7e7da9e26\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 
12:29:15.404710 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-client-ca\") pod \"df2df950-540c-408e-a555-81b7e7da9e26\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.404742 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-config\") pod \"df2df950-540c-408e-a555-81b7e7da9e26\" (UID: \"df2df950-540c-408e-a555-81b7e7da9e26\") " Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.406362 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-config" (OuterVolumeSpecName: "config") pod "df2df950-540c-408e-a555-81b7e7da9e26" (UID: "df2df950-540c-408e-a555-81b7e7da9e26"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:29:15 crc kubenswrapper[4817]: E0320 12:29:15.406503 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:15.906488327 +0000 UTC m=+117.994801110 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.412818 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "df2df950-540c-408e-a555-81b7e7da9e26" (UID: "df2df950-540c-408e-a555-81b7e7da9e26"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.412888 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-client-ca" (OuterVolumeSpecName: "client-ca") pod "df2df950-540c-408e-a555-81b7e7da9e26" (UID: "df2df950-540c-408e-a555-81b7e7da9e26"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.420516 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7556f5bcf-56lwt"] Mar 20 12:29:15 crc kubenswrapper[4817]: E0320 12:29:15.420727 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df2df950-540c-408e-a555-81b7e7da9e26" containerName="controller-manager" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.420744 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="df2df950-540c-408e-a555-81b7e7da9e26" containerName="controller-manager" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.423089 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="df2df950-540c-408e-a555-81b7e7da9e26" containerName="controller-manager" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.423564 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.430861 4817 patch_prober.go:28] interesting pod/router-default-5444994796-nb552 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 12:29:15 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Mar 20 12:29:15 crc kubenswrapper[4817]: [+]process-running ok Mar 20 12:29:15 crc kubenswrapper[4817]: healthz check failed Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.431052 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nb552" podUID="9148329e-b499-499d-a1a6-b1ff1368de6c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.432411 4817 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df2df950-540c-408e-a555-81b7e7da9e26-kube-api-access-fkcl2" (OuterVolumeSpecName: "kube-api-access-fkcl2") pod "df2df950-540c-408e-a555-81b7e7da9e26" (UID: "df2df950-540c-408e-a555-81b7e7da9e26"). InnerVolumeSpecName "kube-api-access-fkcl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.440741 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df2df950-540c-408e-a555-81b7e7da9e26-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "df2df950-540c-408e-a555-81b7e7da9e26" (UID: "df2df950-540c-408e-a555-81b7e7da9e26"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.457560 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7556f5bcf-56lwt"] Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.506879 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-config\") pod \"controller-manager-7556f5bcf-56lwt\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.507020 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-proxy-ca-bundles\") pod \"controller-manager-7556f5bcf-56lwt\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.507177 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-client-ca\") pod \"controller-manager-7556f5bcf-56lwt\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.507269 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.507354 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4812fe5b-83ec-47ef-9032-8cd964146208-serving-cert\") pod \"controller-manager-7556f5bcf-56lwt\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.507426 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tv27\" (UniqueName: \"kubernetes.io/projected/4812fe5b-83ec-47ef-9032-8cd964146208-kube-api-access-8tv27\") pod \"controller-manager-7556f5bcf-56lwt\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.507563 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkcl2\" (UniqueName: \"kubernetes.io/projected/df2df950-540c-408e-a555-81b7e7da9e26-kube-api-access-fkcl2\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.507629 4817 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df2df950-540c-408e-a555-81b7e7da9e26-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.507690 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:15 crc kubenswrapper[4817]: E0320 12:29:15.507707 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:16.007695238 +0000 UTC m=+118.096008021 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.507833 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.507891 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df2df950-540c-408e-a555-81b7e7da9e26-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.523911 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fb87f" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.540014 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.570220 4817 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-cl2qb container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.570280 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb" podUID="1907dc62-9c82-492c-a7b9-642410848e1f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.611908 4817 generic.go:334] "Generic (PLEG): container finished" podID="df2df950-540c-408e-a555-81b7e7da9e26" containerID="04ee0a5c4bf3b02d2fe9ac7dafb951b8e659511e71d8cf5a5229931cdc0f27a4" exitCode=0 Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.611971 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" event={"ID":"df2df950-540c-408e-a555-81b7e7da9e26","Type":"ContainerDied","Data":"04ee0a5c4bf3b02d2fe9ac7dafb951b8e659511e71d8cf5a5229931cdc0f27a4"} Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.611995 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" 
event={"ID":"df2df950-540c-408e-a555-81b7e7da9e26","Type":"ContainerDied","Data":"c3f13b771077ca9e983dc21573c89eccf2f33e9a49c82d884223486ae67ff094"} Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.612011 4817 scope.go:117] "RemoveContainer" containerID="04ee0a5c4bf3b02d2fe9ac7dafb951b8e659511e71d8cf5a5229931cdc0f27a4" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.612091 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jsgjh" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.612776 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5457083-eb9e-4828-839c-a7613592278e-serving-cert\") pod \"f5457083-eb9e-4828-839c-a7613592278e\" (UID: \"f5457083-eb9e-4828-839c-a7613592278e\") " Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.612824 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5457083-eb9e-4828-839c-a7613592278e-config\") pod \"f5457083-eb9e-4828-839c-a7613592278e\" (UID: \"f5457083-eb9e-4828-839c-a7613592278e\") " Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.612883 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69l2w\" (UniqueName: \"kubernetes.io/projected/f5457083-eb9e-4828-839c-a7613592278e-kube-api-access-69l2w\") pod \"f5457083-eb9e-4828-839c-a7613592278e\" (UID: \"f5457083-eb9e-4828-839c-a7613592278e\") " Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.612997 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 
12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.613029 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5457083-eb9e-4828-839c-a7613592278e-client-ca\") pod \"f5457083-eb9e-4828-839c-a7613592278e\" (UID: \"f5457083-eb9e-4828-839c-a7613592278e\") " Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.613204 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-client-ca\") pod \"controller-manager-7556f5bcf-56lwt\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.613244 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4812fe5b-83ec-47ef-9032-8cd964146208-serving-cert\") pod \"controller-manager-7556f5bcf-56lwt\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.613265 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tv27\" (UniqueName: \"kubernetes.io/projected/4812fe5b-83ec-47ef-9032-8cd964146208-kube-api-access-8tv27\") pod \"controller-manager-7556f5bcf-56lwt\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.613305 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-config\") pod \"controller-manager-7556f5bcf-56lwt\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" 
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.613332 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-proxy-ca-bundles\") pod \"controller-manager-7556f5bcf-56lwt\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt"
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.614471 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-proxy-ca-bundles\") pod \"controller-manager-7556f5bcf-56lwt\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt"
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.618053 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5457083-eb9e-4828-839c-a7613592278e-config" (OuterVolumeSpecName: "config") pod "f5457083-eb9e-4828-839c-a7613592278e" (UID: "f5457083-eb9e-4828-839c-a7613592278e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.624581 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5457083-eb9e-4828-839c-a7613592278e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f5457083-eb9e-4828-839c-a7613592278e" (UID: "f5457083-eb9e-4828-839c-a7613592278e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 12:29:15 crc kubenswrapper[4817]: E0320 12:29:15.624704 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:16.124683789 +0000 UTC m=+118.212996572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.624880 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-client-ca\") pod \"controller-manager-7556f5bcf-56lwt\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt"
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.624906 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5457083-eb9e-4828-839c-a7613592278e-client-ca" (OuterVolumeSpecName: "client-ca") pod "f5457083-eb9e-4828-839c-a7613592278e" (UID: "f5457083-eb9e-4828-839c-a7613592278e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.627181 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-config\") pod \"controller-manager-7556f5bcf-56lwt\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt"
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.630847 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4812fe5b-83ec-47ef-9032-8cd964146208-serving-cert\") pod \"controller-manager-7556f5bcf-56lwt\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt"
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.647263 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5457083-eb9e-4828-839c-a7613592278e-kube-api-access-69l2w" (OuterVolumeSpecName: "kube-api-access-69l2w") pod "f5457083-eb9e-4828-839c-a7613592278e" (UID: "f5457083-eb9e-4828-839c-a7613592278e"). InnerVolumeSpecName "kube-api-access-69l2w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.650110 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-55j94" event={"ID":"70b7229d-a4be-4ee2-9b94-d7bc011986d7","Type":"ContainerStarted","Data":"a1e6dc4635339bc3b87bad961fca90276c7a247137f30fa8a5eab2a2d302b3b8"}
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.657825 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tv27\" (UniqueName: \"kubernetes.io/projected/4812fe5b-83ec-47ef-9032-8cd964146208-kube-api-access-8tv27\") pod \"controller-manager-7556f5bcf-56lwt\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt"
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.674436 4817 generic.go:334] "Generic (PLEG): container finished" podID="f5457083-eb9e-4828-839c-a7613592278e" containerID="832ff93d6bfb0a67c7ee312a2ce8f925416d8f3dc3646c77bac551e797edf33e" exitCode=0
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.680234 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg"
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.680435 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" event={"ID":"f5457083-eb9e-4828-839c-a7613592278e","Type":"ContainerDied","Data":"832ff93d6bfb0a67c7ee312a2ce8f925416d8f3dc3646c77bac551e797edf33e"}
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.680491 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg" event={"ID":"f5457083-eb9e-4828-839c-a7613592278e","Type":"ContainerDied","Data":"aca45b8bff85d9946fb90ceee630dd670d8881026cf8647befd776e32d866e6a"}
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.683191 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" podUID="3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf" gracePeriod=30
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.720333 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h694w"
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.720425 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jsgjh"]
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.740444 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.740875 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69l2w\" (UniqueName: \"kubernetes.io/projected/f5457083-eb9e-4828-839c-a7613592278e-kube-api-access-69l2w\") on node \"crc\" DevicePath \"\""
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.740989 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5457083-eb9e-4828-839c-a7613592278e-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.741065 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5457083-eb9e-4828-839c-a7613592278e-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.741238 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5457083-eb9e-4828-839c-a7613592278e-config\") on node \"crc\" DevicePath \"\""
Mar 20 12:29:15 crc kubenswrapper[4817]: E0320 12:29:15.741789 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:16.241771012 +0000 UTC m=+118.330083795 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.744809 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jsgjh"]
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.775654 4817 scope.go:117] "RemoveContainer" containerID="04ee0a5c4bf3b02d2fe9ac7dafb951b8e659511e71d8cf5a5229931cdc0f27a4"
Mar 20 12:29:15 crc kubenswrapper[4817]: E0320 12:29:15.780335 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04ee0a5c4bf3b02d2fe9ac7dafb951b8e659511e71d8cf5a5229931cdc0f27a4\": container with ID starting with 04ee0a5c4bf3b02d2fe9ac7dafb951b8e659511e71d8cf5a5229931cdc0f27a4 not found: ID does not exist" containerID="04ee0a5c4bf3b02d2fe9ac7dafb951b8e659511e71d8cf5a5229931cdc0f27a4"
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.780391 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ee0a5c4bf3b02d2fe9ac7dafb951b8e659511e71d8cf5a5229931cdc0f27a4"} err="failed to get container status \"04ee0a5c4bf3b02d2fe9ac7dafb951b8e659511e71d8cf5a5229931cdc0f27a4\": rpc error: code = NotFound desc = could not find container \"04ee0a5c4bf3b02d2fe9ac7dafb951b8e659511e71d8cf5a5229931cdc0f27a4\": container with ID starting with 04ee0a5c4bf3b02d2fe9ac7dafb951b8e659511e71d8cf5a5229931cdc0f27a4 not found: ID does not exist"
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.780435 4817 scope.go:117] "RemoveContainer" containerID="832ff93d6bfb0a67c7ee312a2ce8f925416d8f3dc3646c77bac551e797edf33e"
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.800987 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt"
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.845692 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 12:29:15 crc kubenswrapper[4817]: E0320 12:29:15.847364 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:16.347348895 +0000 UTC m=+118.435661678 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.883843 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg"]
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.901576 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vt6cg"]
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.907769 4817 scope.go:117] "RemoveContainer" containerID="832ff93d6bfb0a67c7ee312a2ce8f925416d8f3dc3646c77bac551e797edf33e"
Mar 20 12:29:15 crc kubenswrapper[4817]: E0320 12:29:15.912954 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"832ff93d6bfb0a67c7ee312a2ce8f925416d8f3dc3646c77bac551e797edf33e\": container with ID starting with 832ff93d6bfb0a67c7ee312a2ce8f925416d8f3dc3646c77bac551e797edf33e not found: ID does not exist" containerID="832ff93d6bfb0a67c7ee312a2ce8f925416d8f3dc3646c77bac551e797edf33e"
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.912998 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"832ff93d6bfb0a67c7ee312a2ce8f925416d8f3dc3646c77bac551e797edf33e"} err="failed to get container status \"832ff93d6bfb0a67c7ee312a2ce8f925416d8f3dc3646c77bac551e797edf33e\": rpc error: code = NotFound desc = could not find container \"832ff93d6bfb0a67c7ee312a2ce8f925416d8f3dc3646c77bac551e797edf33e\": container with ID starting with 832ff93d6bfb0a67c7ee312a2ce8f925416d8f3dc3646c77bac551e797edf33e not found: ID does not exist"
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.951198 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:15 crc kubenswrapper[4817]: E0320 12:29:15.951531 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:16.451514188 +0000 UTC m=+118.539826971 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:15 crc kubenswrapper[4817]: I0320 12:29:15.987344 4817 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.054195 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 12:29:16 crc kubenswrapper[4817]: E0320 12:29:16.054494 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:16.554476768 +0000 UTC m=+118.642789551 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.055926 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-52mbr"]
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.079513 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sg4sx"]
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.112266 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cl2qb"
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.133592 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qjbdq"]
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.155309 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:16 crc kubenswrapper[4817]: E0320 12:29:16.155778 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:16.655759871 +0000 UTC m=+118.744072654 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.239345 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fb87f"]
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.257318 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 12:29:16 crc kubenswrapper[4817]: E0320 12:29:16.257767 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:16.757740303 +0000 UTC m=+118.846053086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.354552 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7556f5bcf-56lwt"]
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.358304 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:16 crc kubenswrapper[4817]: E0320 12:29:16.358636 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:16.858624895 +0000 UTC m=+118.946937678 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.423345 4817 patch_prober.go:28] interesting pod/router-default-5444994796-nb552 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 12:29:16 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld
Mar 20 12:29:16 crc kubenswrapper[4817]: [+]process-running ok
Mar 20 12:29:16 crc kubenswrapper[4817]: healthz check failed
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.423405 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nb552" podUID="9148329e-b499-499d-a1a6-b1ff1368de6c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.459943 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 12:29:16 crc kubenswrapper[4817]: E0320 12:29:16.460255 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:16.960235967 +0000 UTC m=+119.048548750 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.460310 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:16 crc kubenswrapper[4817]: E0320 12:29:16.460621 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:16.960613728 +0000 UTC m=+119.048926511 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.561185 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 12:29:16 crc kubenswrapper[4817]: E0320 12:29:16.561453 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:17.061401597 +0000 UTC m=+119.149714380 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.562309 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:16 crc kubenswrapper[4817]: E0320 12:29:16.562868 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:17.062855948 +0000 UTC m=+119.151168911 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:16 crc kubenswrapper[4817]: E0320 12:29:16.664412 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:17.164384027 +0000 UTC m=+119.252696810 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.664040 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.664874 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:16 crc kubenswrapper[4817]: E0320 12:29:16.665554 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:17.16554232 +0000 UTC m=+119.253855103 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.679182 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df2df950-540c-408e-a555-81b7e7da9e26" path="/var/lib/kubelet/pods/df2df950-540c-408e-a555-81b7e7da9e26/volumes"
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.680061 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5457083-eb9e-4828-839c-a7613592278e" path="/var/lib/kubelet/pods/f5457083-eb9e-4828-839c-a7613592278e/volumes"
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.687490 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8cgvz"]
Mar 20 12:29:16 crc kubenswrapper[4817]: E0320 12:29:16.687860 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5457083-eb9e-4828-839c-a7613592278e" containerName="route-controller-manager"
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.687886 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5457083-eb9e-4828-839c-a7613592278e" containerName="route-controller-manager"
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.688065 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5457083-eb9e-4828-839c-a7613592278e" containerName="route-controller-manager"
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.689058 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sg4sx" event={"ID":"98014d2a-ca27-4147-a4cf-081ce9325a83","Type":"ContainerStarted","Data":"3002d49f2c49e22e53d595b3cd36a8c0c7db2089a4bc0017d36c3b5b80df5325"}
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.689248 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8cgvz"
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.691319 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.694951 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-55j94" event={"ID":"70b7229d-a4be-4ee2-9b94-d7bc011986d7","Type":"ContainerStarted","Data":"272513ba5edde7fbff00ddb8e727788a8f6d6c561497f646aeb969feea214b9f"}
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.698638 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjbdq" event={"ID":"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9","Type":"ContainerStarted","Data":"06d34d7c10958f4e924fc0ef3ee77766aa5bd4c935967d9d274be0ddfc1da8ae"}
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.701602 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" event={"ID":"4812fe5b-83ec-47ef-9032-8cd964146208","Type":"ContainerStarted","Data":"54f518fa59b0850939c963a6a9e78a77ddbe8de8d71aa8ec6e8d2022461fe5b7"}
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.707977 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.710908 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cgvz"]
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.712913 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fb87f" event={"ID":"870bce80-0a37-4d5f-afbf-ffa85ea34a03","Type":"ContainerStarted","Data":"1bc22e3c69c1b77b2052a1139281a6650ff94a22d4dd5765131a8f823d5a35d9"}
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.716597 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52mbr" event={"ID":"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7","Type":"ContainerStarted","Data":"2e71738029eb6ff1dc463a53a52d058c153f0648f37bf4d8a91bdc49c61093c0"}
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.766805 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 12:29:16 crc kubenswrapper[4817]: E0320 12:29:16.767309 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 12:29:17.267263075 +0000 UTC m=+119.355575868 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.767470 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21934de0-bbff-4fcf-ad45-b3a6a2461030-catalog-content\") pod \"redhat-marketplace-8cgvz\" (UID: \"21934de0-bbff-4fcf-ad45-b3a6a2461030\") " pod="openshift-marketplace/redhat-marketplace-8cgvz"
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.767586 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf"
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.767827 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21934de0-bbff-4fcf-ad45-b3a6a2461030-utilities\") pod \"redhat-marketplace-8cgvz\" (UID: \"21934de0-bbff-4fcf-ad45-b3a6a2461030\") " pod="openshift-marketplace/redhat-marketplace-8cgvz"
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.767950 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh4n2\" (UniqueName: \"kubernetes.io/projected/21934de0-bbff-4fcf-ad45-b3a6a2461030-kube-api-access-zh4n2\") pod \"redhat-marketplace-8cgvz\" (UID: \"21934de0-bbff-4fcf-ad45-b3a6a2461030\") " pod="openshift-marketplace/redhat-marketplace-8cgvz"
Mar 20 12:29:16 crc kubenswrapper[4817]: E0320 12:29:16.767965 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 12:29:17.267949134 +0000 UTC m=+119.356262147 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tcdxf" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.838724 4817 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T12:29:15.987364077Z","Handler":null,"Name":""}
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.847491 4817 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.847532 4817 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.869739 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.869999 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21934de0-bbff-4fcf-ad45-b3a6a2461030-utilities\") pod \"redhat-marketplace-8cgvz\" (UID: \"21934de0-bbff-4fcf-ad45-b3a6a2461030\") " pod="openshift-marketplace/redhat-marketplace-8cgvz"
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.870068 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh4n2\" (UniqueName: \"kubernetes.io/projected/21934de0-bbff-4fcf-ad45-b3a6a2461030-kube-api-access-zh4n2\") pod \"redhat-marketplace-8cgvz\" (UID: \"21934de0-bbff-4fcf-ad45-b3a6a2461030\") " pod="openshift-marketplace/redhat-marketplace-8cgvz"
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.870265 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21934de0-bbff-4fcf-ad45-b3a6a2461030-catalog-content\") pod \"redhat-marketplace-8cgvz\" (UID: \"21934de0-bbff-4fcf-ad45-b3a6a2461030\") " pod="openshift-marketplace/redhat-marketplace-8cgvz"
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.871436 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21934de0-bbff-4fcf-ad45-b3a6a2461030-utilities\") pod \"redhat-marketplace-8cgvz\" (UID: \"21934de0-bbff-4fcf-ad45-b3a6a2461030\") " pod="openshift-marketplace/redhat-marketplace-8cgvz"
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.871513 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName:
\"kubernetes.io/empty-dir/21934de0-bbff-4fcf-ad45-b3a6a2461030-catalog-content\") pod \"redhat-marketplace-8cgvz\" (UID: \"21934de0-bbff-4fcf-ad45-b3a6a2461030\") " pod="openshift-marketplace/redhat-marketplace-8cgvz" Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.875465 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.894464 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh4n2\" (UniqueName: \"kubernetes.io/projected/21934de0-bbff-4fcf-ad45-b3a6a2461030-kube-api-access-zh4n2\") pod \"redhat-marketplace-8cgvz\" (UID: \"21934de0-bbff-4fcf-ad45-b3a6a2461030\") " pod="openshift-marketplace/redhat-marketplace-8cgvz" Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.972289 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.976244 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 12:29:16 crc kubenswrapper[4817]: I0320 12:29:16.976309 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.011883 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8cgvz" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.013035 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tcdxf\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.084402 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tqqkf"] Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.085653 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tqqkf" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.099772 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqqkf"] Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.178919 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnb57\" (UniqueName: \"kubernetes.io/projected/bd837d56-d9f1-4396-975c-5917f6f32bc9-kube-api-access-bnb57\") pod \"redhat-marketplace-tqqkf\" (UID: \"bd837d56-d9f1-4396-975c-5917f6f32bc9\") " pod="openshift-marketplace/redhat-marketplace-tqqkf" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.178977 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd837d56-d9f1-4396-975c-5917f6f32bc9-utilities\") pod \"redhat-marketplace-tqqkf\" (UID: \"bd837d56-d9f1-4396-975c-5917f6f32bc9\") " pod="openshift-marketplace/redhat-marketplace-tqqkf" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.179106 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd837d56-d9f1-4396-975c-5917f6f32bc9-catalog-content\") pod \"redhat-marketplace-tqqkf\" (UID: \"bd837d56-d9f1-4396-975c-5917f6f32bc9\") " pod="openshift-marketplace/redhat-marketplace-tqqkf" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.181302 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.198222 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.198190646 podStartE2EDuration="1.198190646s" podCreationTimestamp="2026-03-20 12:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:17.192023584 +0000 UTC m=+119.280336367" watchObservedRunningTime="2026-03-20 12:29:17.198190646 +0000 UTC m=+119.286503429" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.281532 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd837d56-d9f1-4396-975c-5917f6f32bc9-catalog-content\") pod \"redhat-marketplace-tqqkf\" (UID: \"bd837d56-d9f1-4396-975c-5917f6f32bc9\") " pod="openshift-marketplace/redhat-marketplace-tqqkf" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.281624 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnb57\" (UniqueName: \"kubernetes.io/projected/bd837d56-d9f1-4396-975c-5917f6f32bc9-kube-api-access-bnb57\") pod \"redhat-marketplace-tqqkf\" (UID: \"bd837d56-d9f1-4396-975c-5917f6f32bc9\") " pod="openshift-marketplace/redhat-marketplace-tqqkf" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.281643 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd837d56-d9f1-4396-975c-5917f6f32bc9-utilities\") pod \"redhat-marketplace-tqqkf\" (UID: \"bd837d56-d9f1-4396-975c-5917f6f32bc9\") " pod="openshift-marketplace/redhat-marketplace-tqqkf" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.282329 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd837d56-d9f1-4396-975c-5917f6f32bc9-utilities\") pod \"redhat-marketplace-tqqkf\" (UID: \"bd837d56-d9f1-4396-975c-5917f6f32bc9\") " pod="openshift-marketplace/redhat-marketplace-tqqkf" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.282563 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd837d56-d9f1-4396-975c-5917f6f32bc9-catalog-content\") pod \"redhat-marketplace-tqqkf\" (UID: \"bd837d56-d9f1-4396-975c-5917f6f32bc9\") " pod="openshift-marketplace/redhat-marketplace-tqqkf" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.313571 4817 ???:1] "http: TLS handshake error from 192.168.126.11:43108: no serving certificate available for the kubelet" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.321981 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnb57\" (UniqueName: \"kubernetes.io/projected/bd837d56-d9f1-4396-975c-5917f6f32bc9-kube-api-access-bnb57\") pod \"redhat-marketplace-tqqkf\" (UID: \"bd837d56-d9f1-4396-975c-5917f6f32bc9\") " pod="openshift-marketplace/redhat-marketplace-tqqkf" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.421952 4817 patch_prober.go:28] interesting pod/router-default-5444994796-nb552 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 12:29:17 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Mar 20 12:29:17 crc kubenswrapper[4817]: [+]process-running ok Mar 20 12:29:17 crc kubenswrapper[4817]: healthz check failed Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.422058 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nb552" podUID="9148329e-b499-499d-a1a6-b1ff1368de6c" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.423476 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tqqkf" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.450023 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cgvz"] Mar 20 12:29:17 crc kubenswrapper[4817]: W0320 12:29:17.481854 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21934de0_bbff_4fcf_ad45_b3a6a2461030.slice/crio-b03e99425704d8d2a366d6629c056c159a7bc4aaea203519dbd53ff607bab8d8 WatchSource:0}: Error finding container b03e99425704d8d2a366d6629c056c159a7bc4aaea203519dbd53ff607bab8d8: Status 404 returned error can't find the container with id b03e99425704d8d2a366d6629c056c159a7bc4aaea203519dbd53ff607bab8d8 Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.556032 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tcdxf"] Mar 20 12:29:17 crc kubenswrapper[4817]: W0320 12:29:17.614722 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbae6b6df_ce5b_473f_b03d_07b9d4380961.slice/crio-63b7d687eb9b8a7adb8422e2d6ef4d469b609ef0a9199d1a45f5f9631465fbcd WatchSource:0}: Error finding container 63b7d687eb9b8a7adb8422e2d6ef4d469b609ef0a9199d1a45f5f9631465fbcd: Status 404 returned error can't find the container with id 63b7d687eb9b8a7adb8422e2d6ef4d469b609ef0a9199d1a45f5f9631465fbcd Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.666343 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7zb6t"] Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.667724 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7zb6t" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.670630 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.689195 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7zb6t"] Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.690209 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.691391 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.693399 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.696214 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.702896 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.750920 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-55j94" event={"ID":"70b7229d-a4be-4ee2-9b94-d7bc011986d7","Type":"ContainerStarted","Data":"a60315e98bfd04102b4851b301d7150d3b3e39fd68312d0f08e47bebb677ae62"} Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.754513 4817 generic.go:334] "Generic (PLEG): container finished" podID="7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9" containerID="5f6323f249534e5cbc3405b0650e9d332cb5bfb6a8cd121abe417e9ab9aeaf87" exitCode=0 Mar 20 12:29:17 crc kubenswrapper[4817]: 
I0320 12:29:17.754574 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjbdq" event={"ID":"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9","Type":"ContainerDied","Data":"5f6323f249534e5cbc3405b0650e9d332cb5bfb6a8cd121abe417e9ab9aeaf87"} Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.757687 4817 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.767159 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" event={"ID":"4812fe5b-83ec-47ef-9032-8cd964146208","Type":"ContainerStarted","Data":"d0a9c826a52b0048a1b2f184cae1303810a51ac3260388dbde062f9e696eeb20"} Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.769866 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.797285 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-55j94" podStartSLOduration=11.797265943 podStartE2EDuration="11.797265943s" podCreationTimestamp="2026-03-20 12:29:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:17.773740557 +0000 UTC m=+119.862053350" watchObservedRunningTime="2026-03-20 12:29:17.797265943 +0000 UTC m=+119.885578726" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.802289 4817 generic.go:334] "Generic (PLEG): container finished" podID="b6ef9afb-80f9-48dd-b41d-47874fcf3be9" containerID="787f0d850343330ae3e554b7ff9fa4cbf93eb832f1871de3208bb6e0fbd037eb" exitCode=0 Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.802418 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" event={"ID":"b6ef9afb-80f9-48dd-b41d-47874fcf3be9","Type":"ContainerDied","Data":"787f0d850343330ae3e554b7ff9fa4cbf93eb832f1871de3208bb6e0fbd037eb"} Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.805859 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.807474 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790f1757-c8f1-4a1b-93aa-c476aed2e981-catalog-content\") pod \"redhat-operators-7zb6t\" (UID: \"790f1757-c8f1-4a1b-93aa-c476aed2e981\") " pod="openshift-marketplace/redhat-operators-7zb6t" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.807538 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt7fk\" (UniqueName: \"kubernetes.io/projected/790f1757-c8f1-4a1b-93aa-c476aed2e981-kube-api-access-wt7fk\") pod \"redhat-operators-7zb6t\" (UID: \"790f1757-c8f1-4a1b-93aa-c476aed2e981\") " pod="openshift-marketplace/redhat-operators-7zb6t" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.807586 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25099cbe-233e-4097-bcb7-c271fa9c53c3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"25099cbe-233e-4097-bcb7-c271fa9c53c3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.807645 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790f1757-c8f1-4a1b-93aa-c476aed2e981-utilities\") pod \"redhat-operators-7zb6t\" (UID: 
\"790f1757-c8f1-4a1b-93aa-c476aed2e981\") " pod="openshift-marketplace/redhat-operators-7zb6t" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.807696 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25099cbe-233e-4097-bcb7-c271fa9c53c3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"25099cbe-233e-4097-bcb7-c271fa9c53c3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.847200 4817 generic.go:334] "Generic (PLEG): container finished" podID="870bce80-0a37-4d5f-afbf-ffa85ea34a03" containerID="ff4c0202065bb873ef8fb1228a50974593532c1b6948c3cf49560f16e388afbe" exitCode=0 Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.847307 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fb87f" event={"ID":"870bce80-0a37-4d5f-afbf-ffa85ea34a03","Type":"ContainerDied","Data":"ff4c0202065bb873ef8fb1228a50974593532c1b6948c3cf49560f16e388afbe"} Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.851686 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz"] Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.853002 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.853318 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz"] Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.855290 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" event={"ID":"bae6b6df-ce5b-473f-b03d-07b9d4380961","Type":"ContainerStarted","Data":"63b7d687eb9b8a7adb8422e2d6ef4d469b609ef0a9199d1a45f5f9631465fbcd"} Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.855417 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.855505 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" podStartSLOduration=4.855495116 podStartE2EDuration="4.855495116s" podCreationTimestamp="2026-03-20 12:29:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:17.846607598 +0000 UTC m=+119.934920381" watchObservedRunningTime="2026-03-20 12:29:17.855495116 +0000 UTC m=+119.943807899" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.859628 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.859687 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.859847 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 
12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.859946 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.860019 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.869165 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cgvz" event={"ID":"21934de0-bbff-4fcf-ad45-b3a6a2461030","Type":"ContainerStarted","Data":"eaa046d5703f712a98eb9301405f61bccdd8829d572d806398c85723f0a5e146"} Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.869218 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cgvz" event={"ID":"21934de0-bbff-4fcf-ad45-b3a6a2461030","Type":"ContainerStarted","Data":"b03e99425704d8d2a366d6629c056c159a7bc4aaea203519dbd53ff607bab8d8"} Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.869387 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.894396 4817 generic.go:334] "Generic (PLEG): container finished" podID="bd0c8df8-231e-4c91-8cfe-0182cd07e3d7" containerID="c4e6ff2eccbddf87b22af1daac19c2acad517e0394af07082a881dd04748e654" exitCode=0 Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.894504 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52mbr" event={"ID":"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7","Type":"ContainerDied","Data":"c4e6ff2eccbddf87b22af1daac19c2acad517e0394af07082a881dd04748e654"} Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.904808 4817 generic.go:334] "Generic (PLEG): container finished" podID="98014d2a-ca27-4147-a4cf-081ce9325a83" 
containerID="440749064c710205ebd5a7f89aae0c561114b9be1c40b20337232833b6b1cac9" exitCode=0 Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.905570 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sg4sx" event={"ID":"98014d2a-ca27-4147-a4cf-081ce9325a83","Type":"ContainerDied","Data":"440749064c710205ebd5a7f89aae0c561114b9be1c40b20337232833b6b1cac9"} Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.905604 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqqkf"] Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.909281 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25099cbe-233e-4097-bcb7-c271fa9c53c3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"25099cbe-233e-4097-bcb7-c271fa9c53c3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.909389 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790f1757-c8f1-4a1b-93aa-c476aed2e981-catalog-content\") pod \"redhat-operators-7zb6t\" (UID: \"790f1757-c8f1-4a1b-93aa-c476aed2e981\") " pod="openshift-marketplace/redhat-operators-7zb6t" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.909406 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt7fk\" (UniqueName: \"kubernetes.io/projected/790f1757-c8f1-4a1b-93aa-c476aed2e981-kube-api-access-wt7fk\") pod \"redhat-operators-7zb6t\" (UID: \"790f1757-c8f1-4a1b-93aa-c476aed2e981\") " pod="openshift-marketplace/redhat-operators-7zb6t" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.909435 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/25099cbe-233e-4097-bcb7-c271fa9c53c3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"25099cbe-233e-4097-bcb7-c271fa9c53c3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.909490 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790f1757-c8f1-4a1b-93aa-c476aed2e981-utilities\") pod \"redhat-operators-7zb6t\" (UID: \"790f1757-c8f1-4a1b-93aa-c476aed2e981\") " pod="openshift-marketplace/redhat-operators-7zb6t" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.911730 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790f1757-c8f1-4a1b-93aa-c476aed2e981-catalog-content\") pod \"redhat-operators-7zb6t\" (UID: \"790f1757-c8f1-4a1b-93aa-c476aed2e981\") " pod="openshift-marketplace/redhat-operators-7zb6t" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.911895 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25099cbe-233e-4097-bcb7-c271fa9c53c3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"25099cbe-233e-4097-bcb7-c271fa9c53c3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.912430 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790f1757-c8f1-4a1b-93aa-c476aed2e981-utilities\") pod \"redhat-operators-7zb6t\" (UID: \"790f1757-c8f1-4a1b-93aa-c476aed2e981\") " pod="openshift-marketplace/redhat-operators-7zb6t" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.964999 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt7fk\" (UniqueName: 
\"kubernetes.io/projected/790f1757-c8f1-4a1b-93aa-c476aed2e981-kube-api-access-wt7fk\") pod \"redhat-operators-7zb6t\" (UID: \"790f1757-c8f1-4a1b-93aa-c476aed2e981\") " pod="openshift-marketplace/redhat-operators-7zb6t" Mar 20 12:29:17 crc kubenswrapper[4817]: I0320 12:29:17.975866 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25099cbe-233e-4097-bcb7-c271fa9c53c3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"25099cbe-233e-4097-bcb7-c271fa9c53c3\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.000057 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7zb6t" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.010897 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2c4d03c-5905-48b1-8013-97ad29496c2a-client-ca\") pod \"route-controller-manager-84dfb8c874-xmthz\" (UID: \"c2c4d03c-5905-48b1-8013-97ad29496c2a\") " pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.011056 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcwxw\" (UniqueName: \"kubernetes.io/projected/c2c4d03c-5905-48b1-8013-97ad29496c2a-kube-api-access-tcwxw\") pod \"route-controller-manager-84dfb8c874-xmthz\" (UID: \"c2c4d03c-5905-48b1-8013-97ad29496c2a\") " pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.011077 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c4d03c-5905-48b1-8013-97ad29496c2a-serving-cert\") pod 
\"route-controller-manager-84dfb8c874-xmthz\" (UID: \"c2c4d03c-5905-48b1-8013-97ad29496c2a\") " pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.011105 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c4d03c-5905-48b1-8013-97ad29496c2a-config\") pod \"route-controller-manager-84dfb8c874-xmthz\" (UID: \"c2c4d03c-5905-48b1-8013-97ad29496c2a\") " pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.014610 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.082796 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j47ss"] Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.085561 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j47ss" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.111467 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j47ss"] Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.113202 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcwxw\" (UniqueName: \"kubernetes.io/projected/c2c4d03c-5905-48b1-8013-97ad29496c2a-kube-api-access-tcwxw\") pod \"route-controller-manager-84dfb8c874-xmthz\" (UID: \"c2c4d03c-5905-48b1-8013-97ad29496c2a\") " pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.113259 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c4d03c-5905-48b1-8013-97ad29496c2a-serving-cert\") pod \"route-controller-manager-84dfb8c874-xmthz\" (UID: \"c2c4d03c-5905-48b1-8013-97ad29496c2a\") " pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.113293 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c4d03c-5905-48b1-8013-97ad29496c2a-config\") pod \"route-controller-manager-84dfb8c874-xmthz\" (UID: \"c2c4d03c-5905-48b1-8013-97ad29496c2a\") " pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.113330 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2c4d03c-5905-48b1-8013-97ad29496c2a-client-ca\") pod \"route-controller-manager-84dfb8c874-xmthz\" (UID: \"c2c4d03c-5905-48b1-8013-97ad29496c2a\") " pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 
12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.114603 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2c4d03c-5905-48b1-8013-97ad29496c2a-client-ca\") pod \"route-controller-manager-84dfb8c874-xmthz\" (UID: \"c2c4d03c-5905-48b1-8013-97ad29496c2a\") " pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.115040 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c4d03c-5905-48b1-8013-97ad29496c2a-config\") pod \"route-controller-manager-84dfb8c874-xmthz\" (UID: \"c2c4d03c-5905-48b1-8013-97ad29496c2a\") " pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.143947 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c4d03c-5905-48b1-8013-97ad29496c2a-serving-cert\") pod \"route-controller-manager-84dfb8c874-xmthz\" (UID: \"c2c4d03c-5905-48b1-8013-97ad29496c2a\") " pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.168776 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" podStartSLOduration=80.168754186 podStartE2EDuration="1m20.168754186s" podCreationTimestamp="2026-03-20 12:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:18.167778589 +0000 UTC m=+120.256091372" watchObservedRunningTime="2026-03-20 12:29:18.168754186 +0000 UTC m=+120.257066969" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.192919 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tcwxw\" (UniqueName: \"kubernetes.io/projected/c2c4d03c-5905-48b1-8013-97ad29496c2a-kube-api-access-tcwxw\") pod \"route-controller-manager-84dfb8c874-xmthz\" (UID: \"c2c4d03c-5905-48b1-8013-97ad29496c2a\") " pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.193295 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.214918 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/430036d2-3961-4715-b47b-c7d670e2ee26-utilities\") pod \"redhat-operators-j47ss\" (UID: \"430036d2-3961-4715-b47b-c7d670e2ee26\") " pod="openshift-marketplace/redhat-operators-j47ss" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.215036 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/430036d2-3961-4715-b47b-c7d670e2ee26-catalog-content\") pod \"redhat-operators-j47ss\" (UID: \"430036d2-3961-4715-b47b-c7d670e2ee26\") " pod="openshift-marketplace/redhat-operators-j47ss" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.215078 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm6cp\" (UniqueName: \"kubernetes.io/projected/430036d2-3961-4715-b47b-c7d670e2ee26-kube-api-access-mm6cp\") pod \"redhat-operators-j47ss\" (UID: \"430036d2-3961-4715-b47b-c7d670e2ee26\") " pod="openshift-marketplace/redhat-operators-j47ss" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.316341 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/430036d2-3961-4715-b47b-c7d670e2ee26-catalog-content\") pod \"redhat-operators-j47ss\" (UID: \"430036d2-3961-4715-b47b-c7d670e2ee26\") " pod="openshift-marketplace/redhat-operators-j47ss" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.316798 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm6cp\" (UniqueName: \"kubernetes.io/projected/430036d2-3961-4715-b47b-c7d670e2ee26-kube-api-access-mm6cp\") pod \"redhat-operators-j47ss\" (UID: \"430036d2-3961-4715-b47b-c7d670e2ee26\") " pod="openshift-marketplace/redhat-operators-j47ss" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.316852 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/430036d2-3961-4715-b47b-c7d670e2ee26-utilities\") pod \"redhat-operators-j47ss\" (UID: \"430036d2-3961-4715-b47b-c7d670e2ee26\") " pod="openshift-marketplace/redhat-operators-j47ss" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.317786 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/430036d2-3961-4715-b47b-c7d670e2ee26-utilities\") pod \"redhat-operators-j47ss\" (UID: \"430036d2-3961-4715-b47b-c7d670e2ee26\") " pod="openshift-marketplace/redhat-operators-j47ss" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.318041 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/430036d2-3961-4715-b47b-c7d670e2ee26-catalog-content\") pod \"redhat-operators-j47ss\" (UID: \"430036d2-3961-4715-b47b-c7d670e2ee26\") " pod="openshift-marketplace/redhat-operators-j47ss" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.362032 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm6cp\" (UniqueName: 
\"kubernetes.io/projected/430036d2-3961-4715-b47b-c7d670e2ee26-kube-api-access-mm6cp\") pod \"redhat-operators-j47ss\" (UID: \"430036d2-3961-4715-b47b-c7d670e2ee26\") " pod="openshift-marketplace/redhat-operators-j47ss" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.395249 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7zb6t"] Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.403609 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j47ss" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.426311 4817 patch_prober.go:28] interesting pod/router-default-5444994796-nb552 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 12:29:18 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Mar 20 12:29:18 crc kubenswrapper[4817]: [+]process-running ok Mar 20 12:29:18 crc kubenswrapper[4817]: healthz check failed Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.426395 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nb552" podUID="9148329e-b499-499d-a1a6-b1ff1368de6c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.469331 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.709093 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.712318 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz"] Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.712358 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.790909 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.798806 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.814861 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.823437 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.823708 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.892922 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd38b944-7621-4f61-bdc2-ccf8e1af3bd1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fd38b944-7621-4f61-bdc2-ccf8e1af3bd1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.892988 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd38b944-7621-4f61-bdc2-ccf8e1af3bd1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fd38b944-7621-4f61-bdc2-ccf8e1af3bd1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 12:29:18 
crc kubenswrapper[4817]: I0320 12:29:18.936075 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.936894 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.995298 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd38b944-7621-4f61-bdc2-ccf8e1af3bd1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fd38b944-7621-4f61-bdc2-ccf8e1af3bd1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.995392 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd38b944-7621-4f61-bdc2-ccf8e1af3bd1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fd38b944-7621-4f61-bdc2-ccf8e1af3bd1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 12:29:18 crc kubenswrapper[4817]: I0320 12:29:18.995883 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd38b944-7621-4f61-bdc2-ccf8e1af3bd1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fd38b944-7621-4f61-bdc2-ccf8e1af3bd1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.006088 4817 patch_prober.go:28] interesting pod/console-f9d7485db-mzrc4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.006227 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-mzrc4" 
podUID="9351978b-90a5-48f6-ba2b-68e2c4f2c574" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.011375 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.013264 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.049489 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.051761 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cgvz" event={"ID":"21934de0-bbff-4fcf-ad45-b3a6a2461030","Type":"ContainerDied","Data":"eaa046d5703f712a98eb9301405f61bccdd8829d572d806398c85723f0a5e146"} Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.051615 4817 generic.go:334] "Generic (PLEG): container finished" podID="21934de0-bbff-4fcf-ad45-b3a6a2461030" containerID="eaa046d5703f712a98eb9301405f61bccdd8829d572d806398c85723f0a5e146" exitCode=0 Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.059042 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=1.05901619 podStartE2EDuration="1.05901619s" podCreationTimestamp="2026-03-20 12:29:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:19.050616756 +0000 UTC m=+121.138929529" watchObservedRunningTime="2026-03-20 12:29:19.05901619 +0000 UTC m=+121.147328973" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.060568 4817 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j47ss"] Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.068723 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd38b944-7621-4f61-bdc2-ccf8e1af3bd1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fd38b944-7621-4f61-bdc2-ccf8e1af3bd1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.132317 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" event={"ID":"c2c4d03c-5905-48b1-8013-97ad29496c2a","Type":"ContainerStarted","Data":"e84bf2e5d5b1c9a3e5402b41417dead1bd4e626f6ae3666ab4a7bf1793dd876d"} Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.149561 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"25099cbe-233e-4097-bcb7-c271fa9c53c3","Type":"ContainerStarted","Data":"c4fa169d3b7db9b21c9836f8e486194785359ce57cf961c1a590a73b2158948e"} Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.180372 4817 generic.go:334] "Generic (PLEG): container finished" podID="bd837d56-d9f1-4396-975c-5917f6f32bc9" containerID="63e3829bc22d184ab6399dd41648568db1d6cf5e16f4f4abd1e3fa8a51323558" exitCode=0 Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.181658 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqqkf" event={"ID":"bd837d56-d9f1-4396-975c-5917f6f32bc9","Type":"ContainerDied","Data":"63e3829bc22d184ab6399dd41648568db1d6cf5e16f4f4abd1e3fa8a51323558"} Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.181710 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqqkf" 
event={"ID":"bd837d56-d9f1-4396-975c-5917f6f32bc9","Type":"ContainerStarted","Data":"c3207e1eb6eab0ee10293971afdbbd0635c3767a9687beb575f071166d80b548"} Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.186215 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zb6t" event={"ID":"790f1757-c8f1-4a1b-93aa-c476aed2e981","Type":"ContainerStarted","Data":"75069fe013ef2233d4ea44594c5a419a2343a81a9b627f0cab51ff6bea4e1f26"} Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.186286 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zb6t" event={"ID":"790f1757-c8f1-4a1b-93aa-c476aed2e981","Type":"ContainerStarted","Data":"f33406ff95cf8aef48d22004d4cd507657d7b3bffa280beb3bb091be741dbe19"} Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.202095 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" event={"ID":"bae6b6df-ce5b-473f-b03d-07b9d4380961","Type":"ContainerStarted","Data":"8453e26be0ed7fff720b5e9e12356fde75fb42563dd80f24adb0c2a99f53747b"} Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.212986 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.295903 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.295951 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.315403 4817 patch_prober.go:28] interesting pod/downloads-7954f5f757-2jblp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.315456 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2jblp" podUID="928ba203-a815-4c6d-9097-e1eafd194ab0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.322368 4817 patch_prober.go:28] interesting pod/downloads-7954f5f757-2jblp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.322444 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2jblp" podUID="928ba203-a815-4c6d-9097-e1eafd194ab0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.329610 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.422443 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.450089 4817 patch_prober.go:28] interesting pod/router-default-5444994796-nb552 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 12:29:19 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Mar 20 12:29:19 crc kubenswrapper[4817]: [+]process-running ok Mar 20 12:29:19 crc kubenswrapper[4817]: healthz check failed Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.450154 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nb552" podUID="9148329e-b499-499d-a1a6-b1ff1368de6c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.646038 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.697347 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.725618 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-secret-volume\") pod \"b6ef9afb-80f9-48dd-b41d-47874fcf3be9\" (UID: \"b6ef9afb-80f9-48dd-b41d-47874fcf3be9\") " Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.725681 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-config-volume\") pod \"b6ef9afb-80f9-48dd-b41d-47874fcf3be9\" (UID: \"b6ef9afb-80f9-48dd-b41d-47874fcf3be9\") " Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.725824 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-489bn\" (UniqueName: \"kubernetes.io/projected/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-kube-api-access-489bn\") pod \"b6ef9afb-80f9-48dd-b41d-47874fcf3be9\" (UID: \"b6ef9afb-80f9-48dd-b41d-47874fcf3be9\") " Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.730786 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-config-volume" (OuterVolumeSpecName: "config-volume") pod "b6ef9afb-80f9-48dd-b41d-47874fcf3be9" (UID: "b6ef9afb-80f9-48dd-b41d-47874fcf3be9"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.744348 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-kube-api-access-489bn" (OuterVolumeSpecName: "kube-api-access-489bn") pod "b6ef9afb-80f9-48dd-b41d-47874fcf3be9" (UID: "b6ef9afb-80f9-48dd-b41d-47874fcf3be9"). InnerVolumeSpecName "kube-api-access-489bn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.745075 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b6ef9afb-80f9-48dd-b41d-47874fcf3be9" (UID: "b6ef9afb-80f9-48dd-b41d-47874fcf3be9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.827932 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-489bn\" (UniqueName: \"kubernetes.io/projected/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-kube-api-access-489bn\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.827980 4817 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.827995 4817 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6ef9afb-80f9-48dd-b41d-47874fcf3be9-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:19 crc kubenswrapper[4817]: I0320 12:29:19.902283 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 12:29:20.270778 
4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fd38b944-7621-4f61-bdc2-ccf8e1af3bd1","Type":"ContainerStarted","Data":"614efc28784ea693f09e3a68eab2d566d208006ade16cd765336fd7e8d566446"} Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 12:29:20.282615 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" event={"ID":"c2c4d03c-5905-48b1-8013-97ad29496c2a","Type":"ContainerStarted","Data":"37e971b33107cd00e5942222960978a1f92d9438d7fd279682582575700dfae1"} Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 12:29:20.284312 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 12:29:20.292235 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 12:29:20.292524 4817 generic.go:334] "Generic (PLEG): container finished" podID="430036d2-3961-4715-b47b-c7d670e2ee26" containerID="2bd411f8722e0bdb34167d5b6d5779b54a75851d87cb8671f4857a18f5d4ea74" exitCode=0 Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 12:29:20.292629 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j47ss" event={"ID":"430036d2-3961-4715-b47b-c7d670e2ee26","Type":"ContainerDied","Data":"2bd411f8722e0bdb34167d5b6d5779b54a75851d87cb8671f4857a18f5d4ea74"} Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 12:29:20.292658 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j47ss" event={"ID":"430036d2-3961-4715-b47b-c7d670e2ee26","Type":"ContainerStarted","Data":"0bbc360d2aebf4e2f930d24633eda829b9f63a75ae6148054ef6f3f782ebde93"} Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 
12:29:20.303243 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" podStartSLOduration=7.303212268 podStartE2EDuration="7.303212268s" podCreationTimestamp="2026-03-20 12:29:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:20.301665365 +0000 UTC m=+122.389978158" watchObservedRunningTime="2026-03-20 12:29:20.303212268 +0000 UTC m=+122.391525051" Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 12:29:20.310312 4817 generic.go:334] "Generic (PLEG): container finished" podID="25099cbe-233e-4097-bcb7-c271fa9c53c3" containerID="6688d58ff2f452302c010145af12946bc4f5581cbcc5f8f09a4171676ad90a24" exitCode=0 Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 12:29:20.310577 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"25099cbe-233e-4097-bcb7-c271fa9c53c3","Type":"ContainerDied","Data":"6688d58ff2f452302c010145af12946bc4f5581cbcc5f8f09a4171676ad90a24"} Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 12:29:20.321004 4817 generic.go:334] "Generic (PLEG): container finished" podID="790f1757-c8f1-4a1b-93aa-c476aed2e981" containerID="75069fe013ef2233d4ea44594c5a419a2343a81a9b627f0cab51ff6bea4e1f26" exitCode=0 Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 12:29:20.321177 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zb6t" event={"ID":"790f1757-c8f1-4a1b-93aa-c476aed2e981","Type":"ContainerDied","Data":"75069fe013ef2233d4ea44594c5a419a2343a81a9b627f0cab51ff6bea4e1f26"} Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 12:29:20.364425 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 12:29:20.370619 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566815-k5qhj" event={"ID":"b6ef9afb-80f9-48dd-b41d-47874fcf3be9","Type":"ContainerDied","Data":"7b0ab47123ca151fbe78250cf81d4223b6dc1f9cc68dfc4bc393e2a9ea9fb198"} Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 12:29:20.370671 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b0ab47123ca151fbe78250cf81d4223b6dc1f9cc68dfc4bc393e2a9ea9fb198" Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 12:29:20.372276 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bwlwt" Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 12:29:20.378305 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-gbltz" Mar 20 12:29:20 crc kubenswrapper[4817]: E0320 12:29:20.416963 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 12:29:20.423645 4817 patch_prober.go:28] interesting pod/router-default-5444994796-nb552 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 12:29:20 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Mar 20 12:29:20 crc kubenswrapper[4817]: [+]process-running ok Mar 20 12:29:20 crc kubenswrapper[4817]: healthz check failed Mar 20 12:29:20 crc kubenswrapper[4817]: I0320 
12:29:20.423787 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nb552" podUID="9148329e-b499-499d-a1a6-b1ff1368de6c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 12:29:20 crc kubenswrapper[4817]: E0320 12:29:20.424548 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 12:29:20 crc kubenswrapper[4817]: E0320 12:29:20.480317 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 12:29:20 crc kubenswrapper[4817]: E0320 12:29:20.480383 4817 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" podUID="3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f" containerName="kube-multus-additional-cni-plugins" Mar 20 12:29:21 crc kubenswrapper[4817]: I0320 12:29:21.421689 4817 patch_prober.go:28] interesting pod/router-default-5444994796-nb552 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 12:29:21 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Mar 20 12:29:21 crc kubenswrapper[4817]: [+]process-running ok Mar 20 12:29:21 crc kubenswrapper[4817]: healthz check failed Mar 20 12:29:21 crc 
kubenswrapper[4817]: I0320 12:29:21.421767 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nb552" podUID="9148329e-b499-499d-a1a6-b1ff1368de6c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 12:29:21 crc kubenswrapper[4817]: I0320 12:29:21.443320 4817 generic.go:334] "Generic (PLEG): container finished" podID="fd38b944-7621-4f61-bdc2-ccf8e1af3bd1" containerID="626696d790c40efd9caad7c67aae491a8d2d1efea69b064fb75bd0e3817083dc" exitCode=0 Mar 20 12:29:21 crc kubenswrapper[4817]: I0320 12:29:21.444099 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fd38b944-7621-4f61-bdc2-ccf8e1af3bd1","Type":"ContainerDied","Data":"626696d790c40efd9caad7c67aae491a8d2d1efea69b064fb75bd0e3817083dc"} Mar 20 12:29:21 crc kubenswrapper[4817]: I0320 12:29:21.798084 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 12:29:21 crc kubenswrapper[4817]: I0320 12:29:21.899298 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25099cbe-233e-4097-bcb7-c271fa9c53c3-kubelet-dir\") pod \"25099cbe-233e-4097-bcb7-c271fa9c53c3\" (UID: \"25099cbe-233e-4097-bcb7-c271fa9c53c3\") " Mar 20 12:29:21 crc kubenswrapper[4817]: I0320 12:29:21.899375 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25099cbe-233e-4097-bcb7-c271fa9c53c3-kube-api-access\") pod \"25099cbe-233e-4097-bcb7-c271fa9c53c3\" (UID: \"25099cbe-233e-4097-bcb7-c271fa9c53c3\") " Mar 20 12:29:21 crc kubenswrapper[4817]: I0320 12:29:21.899415 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25099cbe-233e-4097-bcb7-c271fa9c53c3-kubelet-dir" 
(OuterVolumeSpecName: "kubelet-dir") pod "25099cbe-233e-4097-bcb7-c271fa9c53c3" (UID: "25099cbe-233e-4097-bcb7-c271fa9c53c3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 12:29:21 crc kubenswrapper[4817]: I0320 12:29:21.899862 4817 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25099cbe-233e-4097-bcb7-c271fa9c53c3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:21 crc kubenswrapper[4817]: I0320 12:29:21.907021 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25099cbe-233e-4097-bcb7-c271fa9c53c3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "25099cbe-233e-4097-bcb7-c271fa9c53c3" (UID: "25099cbe-233e-4097-bcb7-c271fa9c53c3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:29:22 crc kubenswrapper[4817]: I0320 12:29:22.002438 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25099cbe-233e-4097-bcb7-c271fa9c53c3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:22 crc kubenswrapper[4817]: I0320 12:29:22.425232 4817 patch_prober.go:28] interesting pod/router-default-5444994796-nb552 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 12:29:22 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Mar 20 12:29:22 crc kubenswrapper[4817]: [+]process-running ok Mar 20 12:29:22 crc kubenswrapper[4817]: healthz check failed Mar 20 12:29:22 crc kubenswrapper[4817]: I0320 12:29:22.425735 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nb552" podUID="9148329e-b499-499d-a1a6-b1ff1368de6c" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Mar 20 12:29:22 crc kubenswrapper[4817]: I0320 12:29:22.467873 4817 ???:1] "http: TLS handshake error from 192.168.126.11:43114: no serving certificate available for the kubelet" Mar 20 12:29:22 crc kubenswrapper[4817]: I0320 12:29:22.477797 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"25099cbe-233e-4097-bcb7-c271fa9c53c3","Type":"ContainerDied","Data":"c4fa169d3b7db9b21c9836f8e486194785359ce57cf961c1a590a73b2158948e"} Mar 20 12:29:22 crc kubenswrapper[4817]: I0320 12:29:22.477849 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4fa169d3b7db9b21c9836f8e486194785359ce57cf961c1a590a73b2158948e" Mar 20 12:29:22 crc kubenswrapper[4817]: I0320 12:29:22.477850 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 12:29:22 crc kubenswrapper[4817]: I0320 12:29:22.911092 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 12:29:23 crc kubenswrapper[4817]: I0320 12:29:23.026627 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd38b944-7621-4f61-bdc2-ccf8e1af3bd1-kubelet-dir\") pod \"fd38b944-7621-4f61-bdc2-ccf8e1af3bd1\" (UID: \"fd38b944-7621-4f61-bdc2-ccf8e1af3bd1\") " Mar 20 12:29:23 crc kubenswrapper[4817]: I0320 12:29:23.026757 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd38b944-7621-4f61-bdc2-ccf8e1af3bd1-kube-api-access\") pod \"fd38b944-7621-4f61-bdc2-ccf8e1af3bd1\" (UID: \"fd38b944-7621-4f61-bdc2-ccf8e1af3bd1\") " Mar 20 12:29:23 crc kubenswrapper[4817]: I0320 12:29:23.026862 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd38b944-7621-4f61-bdc2-ccf8e1af3bd1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fd38b944-7621-4f61-bdc2-ccf8e1af3bd1" (UID: "fd38b944-7621-4f61-bdc2-ccf8e1af3bd1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 12:29:23 crc kubenswrapper[4817]: I0320 12:29:23.027334 4817 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd38b944-7621-4f61-bdc2-ccf8e1af3bd1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:23 crc kubenswrapper[4817]: I0320 12:29:23.040419 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd38b944-7621-4f61-bdc2-ccf8e1af3bd1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fd38b944-7621-4f61-bdc2-ccf8e1af3bd1" (UID: "fd38b944-7621-4f61-bdc2-ccf8e1af3bd1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:29:23 crc kubenswrapper[4817]: I0320 12:29:23.135276 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd38b944-7621-4f61-bdc2-ccf8e1af3bd1-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:23 crc kubenswrapper[4817]: I0320 12:29:23.425549 4817 patch_prober.go:28] interesting pod/router-default-5444994796-nb552 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 12:29:23 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Mar 20 12:29:23 crc kubenswrapper[4817]: [+]process-running ok Mar 20 12:29:23 crc kubenswrapper[4817]: healthz check failed Mar 20 12:29:23 crc kubenswrapper[4817]: I0320 12:29:23.425623 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nb552" podUID="9148329e-b499-499d-a1a6-b1ff1368de6c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 12:29:23 crc kubenswrapper[4817]: I0320 12:29:23.501805 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fd38b944-7621-4f61-bdc2-ccf8e1af3bd1","Type":"ContainerDied","Data":"614efc28784ea693f09e3a68eab2d566d208006ade16cd765336fd7e8d566446"} Mar 20 12:29:23 crc kubenswrapper[4817]: I0320 12:29:23.501861 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="614efc28784ea693f09e3a68eab2d566d208006ade16cd765336fd7e8d566446" Mar 20 12:29:23 crc kubenswrapper[4817]: I0320 12:29:23.501940 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 12:29:23 crc kubenswrapper[4817]: I0320 12:29:23.692090 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:29:24 crc kubenswrapper[4817]: I0320 12:29:24.421662 4817 patch_prober.go:28] interesting pod/router-default-5444994796-nb552 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 12:29:24 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Mar 20 12:29:24 crc kubenswrapper[4817]: [+]process-running ok Mar 20 12:29:24 crc kubenswrapper[4817]: healthz check failed Mar 20 12:29:24 crc kubenswrapper[4817]: I0320 12:29:24.422087 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nb552" podUID="9148329e-b499-499d-a1a6-b1ff1368de6c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 12:29:24 crc kubenswrapper[4817]: I0320 12:29:24.560947 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:29:24 crc kubenswrapper[4817]: I0320 12:29:24.561110 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 12:29:24 crc kubenswrapper[4817]: 
I0320 12:29:24.563912 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 12:29:24 crc kubenswrapper[4817]: I0320 12:29:24.574032 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 12:29:24 crc kubenswrapper[4817]: I0320 12:29:24.591898 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 12:29:24 crc kubenswrapper[4817]: I0320 12:29:24.591934 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:29:24 crc kubenswrapper[4817]: I0320 12:29:24.603692 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:29:24 crc kubenswrapper[4817]: I0320 12:29:24.894325 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 12:29:24 crc kubenswrapper[4817]: I0320 12:29:24.930535 4817 ???:1] "http: TLS handshake error from 192.168.126.11:40202: no serving certificate available for the kubelet" Mar 20 12:29:25 crc kubenswrapper[4817]: W0320 12:29:25.273775 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-62f26a9a8dc497f453ea7d5314711e800e637f6a0079eef0f89e24b8727a893c WatchSource:0}: Error finding container 62f26a9a8dc497f453ea7d5314711e800e637f6a0079eef0f89e24b8727a893c: Status 404 returned error can't find the container with id 62f26a9a8dc497f453ea7d5314711e800e637f6a0079eef0f89e24b8727a893c Mar 20 12:29:25 crc kubenswrapper[4817]: I0320 12:29:25.423455 4817 patch_prober.go:28] interesting pod/router-default-5444994796-nb552 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 12:29:25 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Mar 20 12:29:25 crc kubenswrapper[4817]: [+]process-running ok Mar 20 12:29:25 crc kubenswrapper[4817]: healthz check failed Mar 20 12:29:25 crc kubenswrapper[4817]: I0320 12:29:25.424094 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nb552" podUID="9148329e-b499-499d-a1a6-b1ff1368de6c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 12:29:25 crc kubenswrapper[4817]: I0320 12:29:25.452641 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-76blx" Mar 20 12:29:25 crc kubenswrapper[4817]: I0320 12:29:25.480088 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:29:25 crc kubenswrapper[4817]: I0320 12:29:25.491093 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:29:25 crc kubenswrapper[4817]: W0320 12:29:25.569351 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-25230715e5e1f029c5b7d3abe18f4014e0724f5cc273b3833b8e28538891497f WatchSource:0}: Error finding container 25230715e5e1f029c5b7d3abe18f4014e0724f5cc273b3833b8e28538891497f: Status 404 returned error can't find the container with id 25230715e5e1f029c5b7d3abe18f4014e0724f5cc273b3833b8e28538891497f Mar 20 12:29:25 crc kubenswrapper[4817]: I0320 12:29:25.579549 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"25230715e5e1f029c5b7d3abe18f4014e0724f5cc273b3833b8e28538891497f"} Mar 20 12:29:25 crc kubenswrapper[4817]: I0320 12:29:25.581061 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:29:25 crc kubenswrapper[4817]: I0320 12:29:25.581361 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"62f26a9a8dc497f453ea7d5314711e800e637f6a0079eef0f89e24b8727a893c"} Mar 20 12:29:25 crc kubenswrapper[4817]: I0320 12:29:25.581966 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:29:25 crc kubenswrapper[4817]: I0320 12:29:25.678428 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 12:29:25 crc kubenswrapper[4817]: I0320 12:29:25.800217 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gvrbl" Mar 20 12:29:26 crc kubenswrapper[4817]: W0320 12:29:26.299493 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-5e581b64d40a0250b76e8b7677fd8c5aeb700230bd756779668616a4490a609b WatchSource:0}: Error finding container 5e581b64d40a0250b76e8b7677fd8c5aeb700230bd756779668616a4490a609b: Status 404 returned error can't find the container with id 5e581b64d40a0250b76e8b7677fd8c5aeb700230bd756779668616a4490a609b Mar 20 12:29:26 crc kubenswrapper[4817]: I0320 12:29:26.424012 4817 patch_prober.go:28] interesting pod/router-default-5444994796-nb552 container/router namespace/openshift-ingress: Startup 
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 12:29:26 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Mar 20 12:29:26 crc kubenswrapper[4817]: [+]process-running ok Mar 20 12:29:26 crc kubenswrapper[4817]: healthz check failed Mar 20 12:29:26 crc kubenswrapper[4817]: I0320 12:29:26.424461 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nb552" podUID="9148329e-b499-499d-a1a6-b1ff1368de6c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 12:29:26 crc kubenswrapper[4817]: I0320 12:29:26.599384 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8799a164dbb6a95a0820ce812db3edda014631f9dffda428fe9e17deeb576416"} Mar 20 12:29:26 crc kubenswrapper[4817]: I0320 12:29:26.599613 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:29:26 crc kubenswrapper[4817]: I0320 12:29:26.603433 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"82a14ea99a596d30cdf9c4293be5427c069c309353de049b6b2cd455734b455c"} Mar 20 12:29:26 crc kubenswrapper[4817]: I0320 12:29:26.609499 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5e581b64d40a0250b76e8b7677fd8c5aeb700230bd756779668616a4490a609b"} Mar 20 12:29:27 crc kubenswrapper[4817]: I0320 12:29:27.421474 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:27 crc kubenswrapper[4817]: I0320 12:29:27.425065 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-nb552" Mar 20 12:29:28 crc kubenswrapper[4817]: I0320 12:29:28.935465 4817 patch_prober.go:28] interesting pod/console-f9d7485db-mzrc4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 20 12:29:28 crc kubenswrapper[4817]: I0320 12:29:28.935581 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-mzrc4" podUID="9351978b-90a5-48f6-ba2b-68e2c4f2c574" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 20 12:29:29 crc kubenswrapper[4817]: I0320 12:29:29.314613 4817 patch_prober.go:28] interesting pod/downloads-7954f5f757-2jblp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 20 12:29:29 crc kubenswrapper[4817]: I0320 12:29:29.314709 4817 patch_prober.go:28] interesting pod/downloads-7954f5f757-2jblp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 20 12:29:29 crc kubenswrapper[4817]: I0320 12:29:29.314727 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2jblp" podUID="928ba203-a815-4c6d-9097-e1eafd194ab0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 20 12:29:29 crc 
kubenswrapper[4817]: I0320 12:29:29.314768 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2jblp" podUID="928ba203-a815-4c6d-9097-e1eafd194ab0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 20 12:29:30 crc kubenswrapper[4817]: E0320 12:29:30.415683 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 12:29:30 crc kubenswrapper[4817]: E0320 12:29:30.418758 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 12:29:30 crc kubenswrapper[4817]: E0320 12:29:30.421386 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 12:29:30 crc kubenswrapper[4817]: E0320 12:29:30.421429 4817 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" podUID="3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f" containerName="kube-multus-additional-cni-plugins" Mar 20 12:29:32 crc kubenswrapper[4817]: I0320 12:29:32.289676 4817 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7556f5bcf-56lwt"] Mar 20 12:29:32 crc kubenswrapper[4817]: I0320 12:29:32.290141 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" podUID="4812fe5b-83ec-47ef-9032-8cd964146208" containerName="controller-manager" containerID="cri-o://d0a9c826a52b0048a1b2f184cae1303810a51ac3260388dbde062f9e696eeb20" gracePeriod=30 Mar 20 12:29:32 crc kubenswrapper[4817]: I0320 12:29:32.336254 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz"] Mar 20 12:29:32 crc kubenswrapper[4817]: I0320 12:29:32.337082 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" podUID="c2c4d03c-5905-48b1-8013-97ad29496c2a" containerName="route-controller-manager" containerID="cri-o://37e971b33107cd00e5942222960978a1f92d9438d7fd279682582575700dfae1" gracePeriod=30 Mar 20 12:29:32 crc kubenswrapper[4817]: I0320 12:29:32.667647 4817 generic.go:334] "Generic (PLEG): container finished" podID="c2c4d03c-5905-48b1-8013-97ad29496c2a" containerID="37e971b33107cd00e5942222960978a1f92d9438d7fd279682582575700dfae1" exitCode=0 Mar 20 12:29:32 crc kubenswrapper[4817]: I0320 12:29:32.670999 4817 generic.go:334] "Generic (PLEG): container finished" podID="4812fe5b-83ec-47ef-9032-8cd964146208" containerID="d0a9c826a52b0048a1b2f184cae1303810a51ac3260388dbde062f9e696eeb20" exitCode=0 Mar 20 12:29:32 crc kubenswrapper[4817]: I0320 12:29:32.675109 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" event={"ID":"c2c4d03c-5905-48b1-8013-97ad29496c2a","Type":"ContainerDied","Data":"37e971b33107cd00e5942222960978a1f92d9438d7fd279682582575700dfae1"} Mar 20 
12:29:32 crc kubenswrapper[4817]: I0320 12:29:32.675161 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" event={"ID":"4812fe5b-83ec-47ef-9032-8cd964146208","Type":"ContainerDied","Data":"d0a9c826a52b0048a1b2f184cae1303810a51ac3260388dbde062f9e696eeb20"} Mar 20 12:29:32 crc kubenswrapper[4817]: I0320 12:29:32.729698 4817 ???:1] "http: TLS handshake error from 192.168.126.11:40218: no serving certificate available for the kubelet" Mar 20 12:29:34 crc kubenswrapper[4817]: I0320 12:29:34.682554 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"07ab371731443f40f7826801aef6231008ef5600fbb872b183dac0d00515e096"} Mar 20 12:29:35 crc kubenswrapper[4817]: I0320 12:29:35.802898 4817 patch_prober.go:28] interesting pod/controller-manager-7556f5bcf-56lwt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.48:8443/healthz\": dial tcp 10.217.0.48:8443: connect: connection refused" start-of-body= Mar 20 12:29:35 crc kubenswrapper[4817]: I0320 12:29:35.803271 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" podUID="4812fe5b-83ec-47ef-9032-8cd964146208" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.48:8443/healthz\": dial tcp 10.217.0.48:8443: connect: connection refused" Mar 20 12:29:37 crc kubenswrapper[4817]: I0320 12:29:37.187046 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:29:38 crc kubenswrapper[4817]: I0320 12:29:38.199685 4817 patch_prober.go:28] interesting pod/route-controller-manager-84dfb8c874-xmthz container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 20 12:29:38 crc kubenswrapper[4817]: I0320 12:29:38.201218 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" podUID="c2c4d03c-5905-48b1-8013-97ad29496c2a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 20 12:29:38 crc kubenswrapper[4817]: I0320 12:29:38.942667 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:38 crc kubenswrapper[4817]: I0320 12:29:38.948864 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-mzrc4" Mar 20 12:29:39 crc kubenswrapper[4817]: I0320 12:29:39.322431 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-2jblp" Mar 20 12:29:40 crc kubenswrapper[4817]: E0320 12:29:40.413619 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 12:29:40 crc kubenswrapper[4817]: E0320 12:29:40.420846 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 12:29:40 crc kubenswrapper[4817]: E0320 
12:29:40.422489 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 12:29:40 crc kubenswrapper[4817]: E0320 12:29:40.422569 4817 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" podUID="3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f" containerName="kube-multus-additional-cni-plugins" Mar 20 12:29:46 crc kubenswrapper[4817]: I0320 12:29:46.752774 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-rwqck_3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f/kube-multus-additional-cni-plugins/0.log" Mar 20 12:29:46 crc kubenswrapper[4817]: I0320 12:29:46.752862 4817 generic.go:334] "Generic (PLEG): container finished" podID="3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f" containerID="2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf" exitCode=137 Mar 20 12:29:46 crc kubenswrapper[4817]: I0320 12:29:46.752910 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" event={"ID":"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f","Type":"ContainerDied","Data":"2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf"} Mar 20 12:29:46 crc kubenswrapper[4817]: I0320 12:29:46.803020 4817 patch_prober.go:28] interesting pod/controller-manager-7556f5bcf-56lwt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.48:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 
12:29:46 crc kubenswrapper[4817]: I0320 12:29:46.803182 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" podUID="4812fe5b-83ec-47ef-9032-8cd964146208" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.48:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 12:29:49 crc kubenswrapper[4817]: I0320 12:29:49.194797 4817 patch_prober.go:28] interesting pod/route-controller-manager-84dfb8c874-xmthz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 12:29:49 crc kubenswrapper[4817]: I0320 12:29:49.194915 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" podUID="c2c4d03c-5905-48b1-8013-97ad29496c2a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 12:29:49 crc kubenswrapper[4817]: I0320 12:29:49.782998 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v8vzq" Mar 20 12:29:50 crc kubenswrapper[4817]: E0320 12:29:50.403451 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf is running failed: container process not found" containerID="2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf" cmd=["/bin/bash","-c","test -f 
/ready/ready"] Mar 20 12:29:50 crc kubenswrapper[4817]: E0320 12:29:50.403998 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf is running failed: container process not found" containerID="2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 12:29:50 crc kubenswrapper[4817]: E0320 12:29:50.404347 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf is running failed: container process not found" containerID="2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 12:29:50 crc kubenswrapper[4817]: E0320 12:29:50.404403 4817 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" podUID="3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f" containerName="kube-multus-additional-cni-plugins" Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.593622 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 12:29:53 crc kubenswrapper[4817]: E0320 12:29:53.594602 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25099cbe-233e-4097-bcb7-c271fa9c53c3" containerName="pruner" Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.594627 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="25099cbe-233e-4097-bcb7-c271fa9c53c3" containerName="pruner" Mar 20 12:29:53 crc kubenswrapper[4817]: E0320 
12:29:53.594662 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd38b944-7621-4f61-bdc2-ccf8e1af3bd1" containerName="pruner" Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.594675 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd38b944-7621-4f61-bdc2-ccf8e1af3bd1" containerName="pruner" Mar 20 12:29:53 crc kubenswrapper[4817]: E0320 12:29:53.594694 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6ef9afb-80f9-48dd-b41d-47874fcf3be9" containerName="collect-profiles" Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.594708 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ef9afb-80f9-48dd-b41d-47874fcf3be9" containerName="collect-profiles" Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.594999 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd38b944-7621-4f61-bdc2-ccf8e1af3bd1" containerName="pruner" Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.595023 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6ef9afb-80f9-48dd-b41d-47874fcf3be9" containerName="collect-profiles" Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.595039 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="25099cbe-233e-4097-bcb7-c271fa9c53c3" containerName="pruner" Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.595896 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.601469 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.604400 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.604666 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.616926 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fd869c8-5c07-4b3d-b46b-11c21ddf7e06-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8fd869c8-5c07-4b3d-b46b-11c21ddf7e06\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.617281 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8fd869c8-5c07-4b3d-b46b-11c21ddf7e06-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8fd869c8-5c07-4b3d-b46b-11c21ddf7e06\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.723533 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fd869c8-5c07-4b3d-b46b-11c21ddf7e06-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8fd869c8-5c07-4b3d-b46b-11c21ddf7e06\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.723621 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/8fd869c8-5c07-4b3d-b46b-11c21ddf7e06-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8fd869c8-5c07-4b3d-b46b-11c21ddf7e06\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.723693 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8fd869c8-5c07-4b3d-b46b-11c21ddf7e06-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8fd869c8-5c07-4b3d-b46b-11c21ddf7e06\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.742725 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fd869c8-5c07-4b3d-b46b-11c21ddf7e06-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8fd869c8-5c07-4b3d-b46b-11c21ddf7e06\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 12:29:53 crc kubenswrapper[4817]: I0320 12:29:53.934248 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 12:29:56 crc kubenswrapper[4817]: E0320 12:29:56.004356 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 12:29:56 crc kubenswrapper[4817]: E0320 12:29:56.004968 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mm6cp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Containe
rResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-j47ss_openshift-marketplace(430036d2-3961-4715-b47b-c7d670e2ee26): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 12:29:56 crc kubenswrapper[4817]: E0320 12:29:56.006238 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-j47ss" podUID="430036d2-3961-4715-b47b-c7d670e2ee26" Mar 20 12:29:56 crc kubenswrapper[4817]: I0320 12:29:56.802704 4817 patch_prober.go:28] interesting pod/controller-manager-7556f5bcf-56lwt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.48:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 12:29:56 crc kubenswrapper[4817]: I0320 12:29:56.803379 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" podUID="4812fe5b-83ec-47ef-9032-8cd964146208" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.48:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.718647 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-j47ss" podUID="430036d2-3961-4715-b47b-c7d670e2ee26" Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 
12:29:57.814742 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.815226 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fgmld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-52mbr_openshift-marketplace(bd0c8df8-231e-4c91-8cfe-0182cd07e3d7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.816724 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-52mbr" podUID="bd0c8df8-231e-4c91-8cfe-0182cd07e3d7" Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.825680 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.825839 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wd68d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-sg4sx_openshift-marketplace(98014d2a-ca27-4147-a4cf-081ce9325a83): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.827334 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-sg4sx" podUID="98014d2a-ca27-4147-a4cf-081ce9325a83" Mar 20 12:29:57 crc 
kubenswrapper[4817]: I0320 12:29:57.830413 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.840443 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-rwqck_3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f/kube-multus-additional-cni-plugins/0.log" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.840518 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" event={"ID":"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f","Type":"ContainerDied","Data":"2696ac76ba9e4e36250aac2b2464370c0ba4db6cd66d732045ecbf369caa6bee"} Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.840544 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2696ac76ba9e4e36250aac2b2464370c0ba4db6cd66d732045ecbf369caa6bee" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.842208 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.847279 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" event={"ID":"c2c4d03c-5905-48b1-8013-97ad29496c2a","Type":"ContainerDied","Data":"e84bf2e5d5b1c9a3e5402b41417dead1bd4e626f6ae3666ab4a7bf1793dd876d"} Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.847390 4817 scope.go:117] "RemoveContainer" containerID="37e971b33107cd00e5942222960978a1f92d9438d7fd279682582575700dfae1" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.856639 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" event={"ID":"4812fe5b-83ec-47ef-9032-8cd964146208","Type":"ContainerDied","Data":"54f518fa59b0850939c963a6a9e78a77ddbe8de8d71aa8ec6e8d2022461fe5b7"} Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.856868 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7556f5bcf-56lwt" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.858426 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-rwqck_3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f/kube-multus-additional-cni-plugins/0.log" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.858480 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.861960 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5979884754-x9xxh"] Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.862326 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f" containerName="kube-multus-additional-cni-plugins" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.862351 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f" containerName="kube-multus-additional-cni-plugins" Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.862377 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4812fe5b-83ec-47ef-9032-8cd964146208" containerName="controller-manager" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.862386 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="4812fe5b-83ec-47ef-9032-8cd964146208" containerName="controller-manager" Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.862406 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c4d03c-5905-48b1-8013-97ad29496c2a" containerName="route-controller-manager" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.862415 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c4d03c-5905-48b1-8013-97ad29496c2a" containerName="route-controller-manager" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.862548 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="4812fe5b-83ec-47ef-9032-8cd964146208" containerName="controller-manager" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.862569 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2c4d03c-5905-48b1-8013-97ad29496c2a" containerName="route-controller-manager" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.862582 
4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f" containerName="kube-multus-additional-cni-plugins" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.863100 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.874770 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5979884754-x9xxh"] Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.902768 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.902997 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zh4n2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-8cgvz_openshift-marketplace(21934de0-bbff-4fcf-ad45-b3a6a2461030): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.903011 4817 scope.go:117] "RemoveContainer" containerID="d0a9c826a52b0048a1b2f184cae1303810a51ac3260388dbde062f9e696eeb20" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.903592 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tv27\" (UniqueName: 
\"kubernetes.io/projected/4812fe5b-83ec-47ef-9032-8cd964146208-kube-api-access-8tv27\") pod \"4812fe5b-83ec-47ef-9032-8cd964146208\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.903629 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-proxy-ca-bundles\") pod \"4812fe5b-83ec-47ef-9032-8cd964146208\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.903661 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-tuning-conf-dir\") pod \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\" (UID: \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\") " Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.903710 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c4d03c-5905-48b1-8013-97ad29496c2a-serving-cert\") pod \"c2c4d03c-5905-48b1-8013-97ad29496c2a\" (UID: \"c2c4d03c-5905-48b1-8013-97ad29496c2a\") " Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.903734 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-cni-sysctl-allowlist\") pod \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\" (UID: \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\") " Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.903769 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcwxw\" (UniqueName: \"kubernetes.io/projected/c2c4d03c-5905-48b1-8013-97ad29496c2a-kube-api-access-tcwxw\") pod \"c2c4d03c-5905-48b1-8013-97ad29496c2a\" (UID: \"c2c4d03c-5905-48b1-8013-97ad29496c2a\") " Mar 20 12:29:57 crc 
kubenswrapper[4817]: I0320 12:29:57.903790 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j586r\" (UniqueName: \"kubernetes.io/projected/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-kube-api-access-j586r\") pod \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\" (UID: \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\") " Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.903807 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-config\") pod \"4812fe5b-83ec-47ef-9032-8cd964146208\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.903845 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-ready\") pod \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\" (UID: \"3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f\") " Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.903863 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c4d03c-5905-48b1-8013-97ad29496c2a-config\") pod \"c2c4d03c-5905-48b1-8013-97ad29496c2a\" (UID: \"c2c4d03c-5905-48b1-8013-97ad29496c2a\") " Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.903888 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4812fe5b-83ec-47ef-9032-8cd964146208-serving-cert\") pod \"4812fe5b-83ec-47ef-9032-8cd964146208\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.903911 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-client-ca\") pod 
\"4812fe5b-83ec-47ef-9032-8cd964146208\" (UID: \"4812fe5b-83ec-47ef-9032-8cd964146208\") " Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.903931 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2c4d03c-5905-48b1-8013-97ad29496c2a-client-ca\") pod \"c2c4d03c-5905-48b1-8013-97ad29496c2a\" (UID: \"c2c4d03c-5905-48b1-8013-97ad29496c2a\") " Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.904068 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-config\") pod \"controller-manager-5979884754-x9xxh\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.904088 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ca774b2-cbbd-4db9-82a9-820415902c3c-serving-cert\") pod \"controller-manager-5979884754-x9xxh\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.904106 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmphp\" (UniqueName: \"kubernetes.io/projected/4ca774b2-cbbd-4db9-82a9-820415902c3c-kube-api-access-qmphp\") pod \"controller-manager-5979884754-x9xxh\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.904294 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-client-ca\") pod \"controller-manager-5979884754-x9xxh\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.904327 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-proxy-ca-bundles\") pod \"controller-manager-5979884754-x9xxh\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.904872 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-ready" (OuterVolumeSpecName: "ready") pod "3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f" (UID: "3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.905575 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2c4d03c-5905-48b1-8013-97ad29496c2a-config" (OuterVolumeSpecName: "config") pod "c2c4d03c-5905-48b1-8013-97ad29496c2a" (UID: "c2c4d03c-5905-48b1-8013-97ad29496c2a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.906281 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-client-ca" (OuterVolumeSpecName: "client-ca") pod "4812fe5b-83ec-47ef-9032-8cd964146208" (UID: "4812fe5b-83ec-47ef-9032-8cd964146208"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.906948 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-config" (OuterVolumeSpecName: "config") pod "4812fe5b-83ec-47ef-9032-8cd964146208" (UID: "4812fe5b-83ec-47ef-9032-8cd964146208"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.907037 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-8cgvz" podUID="21934de0-bbff-4fcf-ad45-b3a6a2461030" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.908024 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f" (UID: "3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.908638 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f" (UID: "3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f"). InnerVolumeSpecName "tuning-conf-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.914441 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4812fe5b-83ec-47ef-9032-8cd964146208-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4812fe5b-83ec-47ef-9032-8cd964146208" (UID: "4812fe5b-83ec-47ef-9032-8cd964146208"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.915066 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c4d03c-5905-48b1-8013-97ad29496c2a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c2c4d03c-5905-48b1-8013-97ad29496c2a" (UID: "c2c4d03c-5905-48b1-8013-97ad29496c2a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.917268 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-kube-api-access-j586r" (OuterVolumeSpecName: "kube-api-access-j586r") pod "3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f" (UID: "3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f"). InnerVolumeSpecName "kube-api-access-j586r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.917380 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2c4d03c-5905-48b1-8013-97ad29496c2a-kube-api-access-tcwxw" (OuterVolumeSpecName: "kube-api-access-tcwxw") pod "c2c4d03c-5905-48b1-8013-97ad29496c2a" (UID: "c2c4d03c-5905-48b1-8013-97ad29496c2a"). InnerVolumeSpecName "kube-api-access-tcwxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.917389 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.917536 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x69fn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},
RestartPolicy:nil,} start failed in pod community-operators-fb87f_openshift-marketplace(870bce80-0a37-4d5f-afbf-ffa85ea34a03): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.920470 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4812fe5b-83ec-47ef-9032-8cd964146208" (UID: "4812fe5b-83ec-47ef-9032-8cd964146208"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.921191 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-fb87f" podUID="870bce80-0a37-4d5f-afbf-ffa85ea34a03" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.923775 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2c4d03c-5905-48b1-8013-97ad29496c2a-client-ca" (OuterVolumeSpecName: "client-ca") pod "c2c4d03c-5905-48b1-8013-97ad29496c2a" (UID: "c2c4d03c-5905-48b1-8013-97ad29496c2a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:29:57 crc kubenswrapper[4817]: I0320 12:29:57.929975 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4812fe5b-83ec-47ef-9032-8cd964146208-kube-api-access-8tv27" (OuterVolumeSpecName: "kube-api-access-8tv27") pod "4812fe5b-83ec-47ef-9032-8cd964146208" (UID: "4812fe5b-83ec-47ef-9032-8cd964146208"). InnerVolumeSpecName "kube-api-access-8tv27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.960232 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.960383 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wt7fk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},Restar
tPolicy:nil,} start failed in pod redhat-operators-7zb6t_openshift-marketplace(790f1757-c8f1-4a1b-93aa-c476aed2e981): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 12:29:57 crc kubenswrapper[4817]: E0320 12:29:57.962445 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-7zb6t" podUID="790f1757-c8f1-4a1b-93aa-c476aed2e981" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.005721 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-client-ca\") pod \"controller-manager-5979884754-x9xxh\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.005795 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-proxy-ca-bundles\") pod \"controller-manager-5979884754-x9xxh\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.005865 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-config\") pod \"controller-manager-5979884754-x9xxh\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.005888 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ca774b2-cbbd-4db9-82a9-820415902c3c-serving-cert\") pod \"controller-manager-5979884754-x9xxh\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.005907 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmphp\" (UniqueName: \"kubernetes.io/projected/4ca774b2-cbbd-4db9-82a9-820415902c3c-kube-api-access-qmphp\") pod \"controller-manager-5979884754-x9xxh\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.006009 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2c4d03c-5905-48b1-8013-97ad29496c2a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.006024 4817 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.006036 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcwxw\" (UniqueName: \"kubernetes.io/projected/c2c4d03c-5905-48b1-8013-97ad29496c2a-kube-api-access-tcwxw\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.006046 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j586r\" (UniqueName: \"kubernetes.io/projected/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-kube-api-access-j586r\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.006057 4817 reconciler_common.go:293] "Volume detached 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.006066 4817 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-ready\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.006105 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c4d03c-5905-48b1-8013-97ad29496c2a-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.006114 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4812fe5b-83ec-47ef-9032-8cd964146208-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.006142 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.006154 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2c4d03c-5905-48b1-8013-97ad29496c2a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.006168 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tv27\" (UniqueName: \"kubernetes.io/projected/4812fe5b-83ec-47ef-9032-8cd964146208-kube-api-access-8tv27\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.006182 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4812fe5b-83ec-47ef-9032-8cd964146208-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:58 crc 
kubenswrapper[4817]: I0320 12:29:58.006191 4817 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.007913 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-client-ca\") pod \"controller-manager-5979884754-x9xxh\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.011198 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-proxy-ca-bundles\") pod \"controller-manager-5979884754-x9xxh\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.011418 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-config\") pod \"controller-manager-5979884754-x9xxh\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.014246 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ca774b2-cbbd-4db9-82a9-820415902c3c-serving-cert\") pod \"controller-manager-5979884754-x9xxh\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.024479 4817 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qmphp\" (UniqueName: \"kubernetes.io/projected/4ca774b2-cbbd-4db9-82a9-820415902c3c-kube-api-access-qmphp\") pod \"controller-manager-5979884754-x9xxh\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.052736 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.180523 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.187435 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7556f5bcf-56lwt"] Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.191044 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7556f5bcf-56lwt"] Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.387492 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5979884754-x9xxh"] Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.577160 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.580102 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.595922 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.613310 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e36838e6-8822-4760-a052-b888667f5a14-kube-api-access\") pod \"installer-9-crc\" (UID: \"e36838e6-8822-4760-a052-b888667f5a14\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.613423 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e36838e6-8822-4760-a052-b888667f5a14-var-lock\") pod \"installer-9-crc\" (UID: \"e36838e6-8822-4760-a052-b888667f5a14\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.613464 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e36838e6-8822-4760-a052-b888667f5a14-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e36838e6-8822-4760-a052-b888667f5a14\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.679891 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4812fe5b-83ec-47ef-9032-8cd964146208" path="/var/lib/kubelet/pods/4812fe5b-83ec-47ef-9032-8cd964146208/volumes" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.714795 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e36838e6-8822-4760-a052-b888667f5a14-var-lock\") pod \"installer-9-crc\" (UID: \"e36838e6-8822-4760-a052-b888667f5a14\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.714987 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e36838e6-8822-4760-a052-b888667f5a14-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e36838e6-8822-4760-a052-b888667f5a14\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.715142 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e36838e6-8822-4760-a052-b888667f5a14-kube-api-access\") pod \"installer-9-crc\" (UID: \"e36838e6-8822-4760-a052-b888667f5a14\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.715458 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e36838e6-8822-4760-a052-b888667f5a14-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e36838e6-8822-4760-a052-b888667f5a14\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.715579 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e36838e6-8822-4760-a052-b888667f5a14-var-lock\") pod \"installer-9-crc\" (UID: \"e36838e6-8822-4760-a052-b888667f5a14\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.737429 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e36838e6-8822-4760-a052-b888667f5a14-kube-api-access\") pod \"installer-9-crc\" (UID: \"e36838e6-8822-4760-a052-b888667f5a14\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.863281 4817 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.865752 4817 generic.go:334] "Generic (PLEG): container finished" podID="7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9" containerID="645ae9ad4b0ed0cef6aab7da0e164b7f357446dd381a2ff144b9139cf06ae5bf" exitCode=0 Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.865831 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjbdq" event={"ID":"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9","Type":"ContainerDied","Data":"645ae9ad4b0ed0cef6aab7da0e164b7f357446dd381a2ff144b9139cf06ae5bf"} Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.868072 4817 generic.go:334] "Generic (PLEG): container finished" podID="bd837d56-d9f1-4396-975c-5917f6f32bc9" containerID="43aa945367b972f7675e04316818ca6764a073746e9b97df2c970e921c08ba86" exitCode=0 Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.868177 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqqkf" event={"ID":"bd837d56-d9f1-4396-975c-5917f6f32bc9","Type":"ContainerDied","Data":"43aa945367b972f7675e04316818ca6764a073746e9b97df2c970e921c08ba86"} Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.873280 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" event={"ID":"4ca774b2-cbbd-4db9-82a9-820415902c3c","Type":"ContainerStarted","Data":"da7efccbc71abef4be4287757dc2df3ca27b09c8601ce8a7ea1c610e2ee822de"} Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.873334 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" event={"ID":"4ca774b2-cbbd-4db9-82a9-820415902c3c","Type":"ContainerStarted","Data":"78ac0dd30d2c99ec53ceb6f9453cf39bf8765a392e8455eeeeba8fb738869ebc"} Mar 20 12:29:58 crc kubenswrapper[4817]: 
I0320 12:29:58.873492 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.879956 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8fd869c8-5c07-4b3d-b46b-11c21ddf7e06","Type":"ContainerStarted","Data":"84a48d99704ea7b4dbe38c31bd6e2dcdf8d355fc384977c6d5851f7d11f8b440"} Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.880011 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8fd869c8-5c07-4b3d-b46b-11c21ddf7e06","Type":"ContainerStarted","Data":"f914cfbe5270cb9cfc27b284c619d8d7191cb6638c19e50137bb2f3f0ab7b5b7"} Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.880673 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-rwqck" Mar 20 12:29:58 crc kubenswrapper[4817]: E0320 12:29:58.881766 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7zb6t" podUID="790f1757-c8f1-4a1b-93aa-c476aed2e981" Mar 20 12:29:58 crc kubenswrapper[4817]: E0320 12:29:58.882298 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-52mbr" podUID="bd0c8df8-231e-4c91-8cfe-0182cd07e3d7" Mar 20 12:29:58 crc kubenswrapper[4817]: E0320 12:29:58.882391 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off 
pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-fb87f" podUID="870bce80-0a37-4d5f-afbf-ffa85ea34a03" Mar 20 12:29:58 crc kubenswrapper[4817]: E0320 12:29:58.883533 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-8cgvz" podUID="21934de0-bbff-4fcf-ad45-b3a6a2461030" Mar 20 12:29:58 crc kubenswrapper[4817]: E0320 12:29:58.883675 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-sg4sx" podUID="98014d2a-ca27-4147-a4cf-081ce9325a83" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.895859 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.938538 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz"] Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.949679 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84dfb8c874-xmthz"] Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.954170 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 12:29:58 crc kubenswrapper[4817]: I0320 12:29:58.984827 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=5.984797005 podStartE2EDuration="5.984797005s" podCreationTimestamp="2026-03-20 12:29:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:58.984107575 +0000 UTC m=+161.072420358" watchObservedRunningTime="2026-03-20 12:29:58.984797005 +0000 UTC m=+161.073109788" Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.004424 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" podStartSLOduration=7.004402693 podStartE2EDuration="7.004402693s" podCreationTimestamp="2026-03-20 12:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:59.001749936 +0000 UTC m=+161.090062719" watchObservedRunningTime="2026-03-20 12:29:59.004402693 +0000 UTC m=+161.092715476" Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.074427 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-rwqck"] Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.081431 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-rwqck"] Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.221358 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.865050 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv"] Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 
12:29:59.866698 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.870433 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.870841 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.871074 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.871266 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.872334 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.872627 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.887805 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv"] Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.895969 4817 generic.go:334] "Generic (PLEG): container finished" podID="8fd869c8-5c07-4b3d-b46b-11c21ddf7e06" containerID="84a48d99704ea7b4dbe38c31bd6e2dcdf8d355fc384977c6d5851f7d11f8b440" exitCode=0 Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.896693 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"8fd869c8-5c07-4b3d-b46b-11c21ddf7e06","Type":"ContainerDied","Data":"84a48d99704ea7b4dbe38c31bd6e2dcdf8d355fc384977c6d5851f7d11f8b440"} Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.912733 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e36838e6-8822-4760-a052-b888667f5a14","Type":"ContainerStarted","Data":"6fdeb4ea13193a05df960c3380d31afda71a7344c20f3aeabd012396f91e8414"} Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.912786 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e36838e6-8822-4760-a052-b888667f5a14","Type":"ContainerStarted","Data":"81d779dddb471c2dd437bbfbbbc642362d9a0dc546d028e4d8366ace2aab8fb6"} Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.915887 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjbdq" event={"ID":"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9","Type":"ContainerStarted","Data":"2f9f20cb3abdd8453a9773ff3c869181f48619407bc39bd78278fb7bc1cf4e31"} Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.926181 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqqkf" event={"ID":"bd837d56-d9f1-4396-975c-5917f6f32bc9","Type":"ContainerStarted","Data":"7e43bf2f844a366b1a276ff30bf1d2c0bec23103c7fa576f5e98893fdda9e672"} Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.929448 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/519193a3-0c3a-4736-a169-233cc2d89766-client-ca\") pod \"route-controller-manager-86956f7bdb-pxfnv\" (UID: \"519193a3-0c3a-4736-a169-233cc2d89766\") " pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.929506 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/519193a3-0c3a-4736-a169-233cc2d89766-config\") pod \"route-controller-manager-86956f7bdb-pxfnv\" (UID: \"519193a3-0c3a-4736-a169-233cc2d89766\") " pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.929654 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/519193a3-0c3a-4736-a169-233cc2d89766-serving-cert\") pod \"route-controller-manager-86956f7bdb-pxfnv\" (UID: \"519193a3-0c3a-4736-a169-233cc2d89766\") " pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.929757 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbdnx\" (UniqueName: \"kubernetes.io/projected/519193a3-0c3a-4736-a169-233cc2d89766-kube-api-access-sbdnx\") pod \"route-controller-manager-86956f7bdb-pxfnv\" (UID: \"519193a3-0c3a-4736-a169-233cc2d89766\") " pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.941842 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.9418216240000001 podStartE2EDuration="1.941821624s" podCreationTimestamp="2026-03-20 12:29:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:29:59.941625658 +0000 UTC m=+162.029938441" watchObservedRunningTime="2026-03-20 12:29:59.941821624 +0000 UTC m=+162.030134407" Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.966855 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-qjbdq" podStartSLOduration=4.397731808 podStartE2EDuration="45.966834899s" podCreationTimestamp="2026-03-20 12:29:14 +0000 UTC" firstStartedPulling="2026-03-20 12:29:17.757373621 +0000 UTC m=+119.845686404" lastFinishedPulling="2026-03-20 12:29:59.326476712 +0000 UTC m=+161.414789495" observedRunningTime="2026-03-20 12:29:59.961513775 +0000 UTC m=+162.049826548" watchObservedRunningTime="2026-03-20 12:29:59.966834899 +0000 UTC m=+162.055147682" Mar 20 12:29:59 crc kubenswrapper[4817]: I0320 12:29:59.986545 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tqqkf" podStartSLOduration=2.753079242 podStartE2EDuration="42.98652852s" podCreationTimestamp="2026-03-20 12:29:17 +0000 UTC" firstStartedPulling="2026-03-20 12:29:19.185260028 +0000 UTC m=+121.273572801" lastFinishedPulling="2026-03-20 12:29:59.418709286 +0000 UTC m=+161.507022079" observedRunningTime="2026-03-20 12:29:59.983384619 +0000 UTC m=+162.071697422" watchObservedRunningTime="2026-03-20 12:29:59.98652852 +0000 UTC m=+162.074841303" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.030692 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbdnx\" (UniqueName: \"kubernetes.io/projected/519193a3-0c3a-4736-a169-233cc2d89766-kube-api-access-sbdnx\") pod \"route-controller-manager-86956f7bdb-pxfnv\" (UID: \"519193a3-0c3a-4736-a169-233cc2d89766\") " pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.030801 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/519193a3-0c3a-4736-a169-233cc2d89766-client-ca\") pod \"route-controller-manager-86956f7bdb-pxfnv\" (UID: \"519193a3-0c3a-4736-a169-233cc2d89766\") " 
pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.030838 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/519193a3-0c3a-4736-a169-233cc2d89766-config\") pod \"route-controller-manager-86956f7bdb-pxfnv\" (UID: \"519193a3-0c3a-4736-a169-233cc2d89766\") " pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.031972 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/519193a3-0c3a-4736-a169-233cc2d89766-serving-cert\") pod \"route-controller-manager-86956f7bdb-pxfnv\" (UID: \"519193a3-0c3a-4736-a169-233cc2d89766\") " pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.032420 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/519193a3-0c3a-4736-a169-233cc2d89766-client-ca\") pod \"route-controller-manager-86956f7bdb-pxfnv\" (UID: \"519193a3-0c3a-4736-a169-233cc2d89766\") " pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.032995 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/519193a3-0c3a-4736-a169-233cc2d89766-config\") pod \"route-controller-manager-86956f7bdb-pxfnv\" (UID: \"519193a3-0c3a-4736-a169-233cc2d89766\") " pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.039711 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/519193a3-0c3a-4736-a169-233cc2d89766-serving-cert\") pod \"route-controller-manager-86956f7bdb-pxfnv\" (UID: \"519193a3-0c3a-4736-a169-233cc2d89766\") " pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.049682 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbdnx\" (UniqueName: \"kubernetes.io/projected/519193a3-0c3a-4736-a169-233cc2d89766-kube-api-access-sbdnx\") pod \"route-controller-manager-86956f7bdb-pxfnv\" (UID: \"519193a3-0c3a-4736-a169-233cc2d89766\") " pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.144854 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566830-tj7ps"] Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.145981 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566830-tj7ps" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.148920 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.149608 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lqzqd" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.150975 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.154934 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566830-tj7ps"] Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.199776 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.235990 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgjn6\" (UniqueName: \"kubernetes.io/projected/a41c54b1-d599-494b-8476-a2700bb5424f-kube-api-access-rgjn6\") pod \"auto-csr-approver-29566830-tj7ps\" (UID: \"a41c54b1-d599-494b-8476-a2700bb5424f\") " pod="openshift-infra/auto-csr-approver-29566830-tj7ps" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.245613 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm"] Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.246324 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.248320 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.249598 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.265195 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm"] Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.337642 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf6302c9-3377-4a44-bf25-6acf56bb86d9-secret-volume\") pod \"collect-profiles-29566830-nbdvm\" (UID: \"cf6302c9-3377-4a44-bf25-6acf56bb86d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm" Mar 20 12:30:00 
crc kubenswrapper[4817]: I0320 12:30:00.337704 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kc7r\" (UniqueName: \"kubernetes.io/projected/cf6302c9-3377-4a44-bf25-6acf56bb86d9-kube-api-access-2kc7r\") pod \"collect-profiles-29566830-nbdvm\" (UID: \"cf6302c9-3377-4a44-bf25-6acf56bb86d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.337772 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgjn6\" (UniqueName: \"kubernetes.io/projected/a41c54b1-d599-494b-8476-a2700bb5424f-kube-api-access-rgjn6\") pod \"auto-csr-approver-29566830-tj7ps\" (UID: \"a41c54b1-d599-494b-8476-a2700bb5424f\") " pod="openshift-infra/auto-csr-approver-29566830-tj7ps" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.338005 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf6302c9-3377-4a44-bf25-6acf56bb86d9-config-volume\") pod \"collect-profiles-29566830-nbdvm\" (UID: \"cf6302c9-3377-4a44-bf25-6acf56bb86d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.373940 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgjn6\" (UniqueName: \"kubernetes.io/projected/a41c54b1-d599-494b-8476-a2700bb5424f-kube-api-access-rgjn6\") pod \"auto-csr-approver-29566830-tj7ps\" (UID: \"a41c54b1-d599-494b-8476-a2700bb5424f\") " pod="openshift-infra/auto-csr-approver-29566830-tj7ps" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.440311 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf6302c9-3377-4a44-bf25-6acf56bb86d9-config-volume\") pod \"collect-profiles-29566830-nbdvm\" 
(UID: \"cf6302c9-3377-4a44-bf25-6acf56bb86d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.440593 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf6302c9-3377-4a44-bf25-6acf56bb86d9-secret-volume\") pod \"collect-profiles-29566830-nbdvm\" (UID: \"cf6302c9-3377-4a44-bf25-6acf56bb86d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.440632 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kc7r\" (UniqueName: \"kubernetes.io/projected/cf6302c9-3377-4a44-bf25-6acf56bb86d9-kube-api-access-2kc7r\") pod \"collect-profiles-29566830-nbdvm\" (UID: \"cf6302c9-3377-4a44-bf25-6acf56bb86d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.441219 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf6302c9-3377-4a44-bf25-6acf56bb86d9-config-volume\") pod \"collect-profiles-29566830-nbdvm\" (UID: \"cf6302c9-3377-4a44-bf25-6acf56bb86d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.448034 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf6302c9-3377-4a44-bf25-6acf56bb86d9-secret-volume\") pod \"collect-profiles-29566830-nbdvm\" (UID: \"cf6302c9-3377-4a44-bf25-6acf56bb86d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.459754 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kc7r\" (UniqueName: 
\"kubernetes.io/projected/cf6302c9-3377-4a44-bf25-6acf56bb86d9-kube-api-access-2kc7r\") pod \"collect-profiles-29566830-nbdvm\" (UID: \"cf6302c9-3377-4a44-bf25-6acf56bb86d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.465327 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566830-tj7ps" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.579748 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.703796 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f" path="/var/lib/kubelet/pods/3c6ee1e3-c1d7-4b79-8d28-dae700f3de2f/volumes" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.704590 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2c4d03c-5905-48b1-8013-97ad29496c2a" path="/var/lib/kubelet/pods/c2c4d03c-5905-48b1-8013-97ad29496c2a/volumes" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.705044 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv"] Mar 20 12:30:00 crc kubenswrapper[4817]: W0320 12:30:00.706738 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod519193a3_0c3a_4736_a169_233cc2d89766.slice/crio-632822be3abb0f66f93c309bc7d1fc01186230fd86969c581ad0ab09d3fa499d WatchSource:0}: Error finding container 632822be3abb0f66f93c309bc7d1fc01186230fd86969c581ad0ab09d3fa499d: Status 404 returned error can't find the container with id 632822be3abb0f66f93c309bc7d1fc01186230fd86969c581ad0ab09d3fa499d Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.734808 4817 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566830-tj7ps"] Mar 20 12:30:00 crc kubenswrapper[4817]: W0320 12:30:00.765817 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda41c54b1_d599_494b_8476_a2700bb5424f.slice/crio-9cb3bad0778657a03f25df1d3b17335f779d13f022f1f84ec96b3a6cbe5e5581 WatchSource:0}: Error finding container 9cb3bad0778657a03f25df1d3b17335f779d13f022f1f84ec96b3a6cbe5e5581: Status 404 returned error can't find the container with id 9cb3bad0778657a03f25df1d3b17335f779d13f022f1f84ec96b3a6cbe5e5581 Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.931457 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" event={"ID":"519193a3-0c3a-4736-a169-233cc2d89766","Type":"ContainerStarted","Data":"f13a95fbd986182e392491f10b939785a86d6112916ffeafa183c84cc7164d4a"} Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.932031 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.932078 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" event={"ID":"519193a3-0c3a-4736-a169-233cc2d89766","Type":"ContainerStarted","Data":"632822be3abb0f66f93c309bc7d1fc01186230fd86969c581ad0ab09d3fa499d"} Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.932750 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566830-tj7ps" event={"ID":"a41c54b1-d599-494b-8476-a2700bb5424f","Type":"ContainerStarted","Data":"9cb3bad0778657a03f25df1d3b17335f779d13f022f1f84ec96b3a6cbe5e5581"} Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.933982 4817 patch_prober.go:28] interesting 
pod/route-controller-manager-86956f7bdb-pxfnv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Mar 20 12:30:00 crc kubenswrapper[4817]: I0320 12:30:00.934024 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" podUID="519193a3-0c3a-4736-a169-233cc2d89766" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Mar 20 12:30:01 crc kubenswrapper[4817]: I0320 12:30:01.046039 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" podStartSLOduration=9.0460194 podStartE2EDuration="9.0460194s" podCreationTimestamp="2026-03-20 12:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:30:00.955586518 +0000 UTC m=+163.043899301" watchObservedRunningTime="2026-03-20 12:30:01.0460194 +0000 UTC m=+163.134332183" Mar 20 12:30:01 crc kubenswrapper[4817]: I0320 12:30:01.046650 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm"] Mar 20 12:30:01 crc kubenswrapper[4817]: I0320 12:30:01.219603 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 12:30:01 crc kubenswrapper[4817]: I0320 12:30:01.263749 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8fd869c8-5c07-4b3d-b46b-11c21ddf7e06-kubelet-dir\") pod \"8fd869c8-5c07-4b3d-b46b-11c21ddf7e06\" (UID: \"8fd869c8-5c07-4b3d-b46b-11c21ddf7e06\") " Mar 20 12:30:01 crc kubenswrapper[4817]: I0320 12:30:01.264328 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fd869c8-5c07-4b3d-b46b-11c21ddf7e06-kube-api-access\") pod \"8fd869c8-5c07-4b3d-b46b-11c21ddf7e06\" (UID: \"8fd869c8-5c07-4b3d-b46b-11c21ddf7e06\") " Mar 20 12:30:01 crc kubenswrapper[4817]: I0320 12:30:01.264257 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fd869c8-5c07-4b3d-b46b-11c21ddf7e06-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8fd869c8-5c07-4b3d-b46b-11c21ddf7e06" (UID: "8fd869c8-5c07-4b3d-b46b-11c21ddf7e06"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 12:30:01 crc kubenswrapper[4817]: I0320 12:30:01.267298 4817 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8fd869c8-5c07-4b3d-b46b-11c21ddf7e06-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:01 crc kubenswrapper[4817]: I0320 12:30:01.277487 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd869c8-5c07-4b3d-b46b-11c21ddf7e06-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8fd869c8-5c07-4b3d-b46b-11c21ddf7e06" (UID: "8fd869c8-5c07-4b3d-b46b-11c21ddf7e06"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:30:01 crc kubenswrapper[4817]: I0320 12:30:01.368798 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8fd869c8-5c07-4b3d-b46b-11c21ddf7e06-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:01 crc kubenswrapper[4817]: I0320 12:30:01.949692 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 12:30:01 crc kubenswrapper[4817]: I0320 12:30:01.949683 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8fd869c8-5c07-4b3d-b46b-11c21ddf7e06","Type":"ContainerDied","Data":"f914cfbe5270cb9cfc27b284c619d8d7191cb6638c19e50137bb2f3f0ab7b5b7"} Mar 20 12:30:01 crc kubenswrapper[4817]: I0320 12:30:01.950755 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f914cfbe5270cb9cfc27b284c619d8d7191cb6638c19e50137bb2f3f0ab7b5b7" Mar 20 12:30:01 crc kubenswrapper[4817]: I0320 12:30:01.957964 4817 generic.go:334] "Generic (PLEG): container finished" podID="cf6302c9-3377-4a44-bf25-6acf56bb86d9" containerID="96e927cdda04ab1a80bc54c57526932892c122cc8fbd35b52a90f33f6f28da40" exitCode=0 Mar 20 12:30:01 crc kubenswrapper[4817]: I0320 12:30:01.958068 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm" event={"ID":"cf6302c9-3377-4a44-bf25-6acf56bb86d9","Type":"ContainerDied","Data":"96e927cdda04ab1a80bc54c57526932892c122cc8fbd35b52a90f33f6f28da40"} Mar 20 12:30:01 crc kubenswrapper[4817]: I0320 12:30:01.958184 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm" 
event={"ID":"cf6302c9-3377-4a44-bf25-6acf56bb86d9","Type":"ContainerStarted","Data":"dab47014c467d251452a1d41965bd504d31530f77efbf8b018b8c4ed9a726b6e"} Mar 20 12:30:01 crc kubenswrapper[4817]: I0320 12:30:01.968755 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" Mar 20 12:30:03 crc kubenswrapper[4817]: I0320 12:30:03.244429 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm" Mar 20 12:30:03 crc kubenswrapper[4817]: I0320 12:30:03.300209 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf6302c9-3377-4a44-bf25-6acf56bb86d9-config-volume\") pod \"cf6302c9-3377-4a44-bf25-6acf56bb86d9\" (UID: \"cf6302c9-3377-4a44-bf25-6acf56bb86d9\") " Mar 20 12:30:03 crc kubenswrapper[4817]: I0320 12:30:03.300319 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kc7r\" (UniqueName: \"kubernetes.io/projected/cf6302c9-3377-4a44-bf25-6acf56bb86d9-kube-api-access-2kc7r\") pod \"cf6302c9-3377-4a44-bf25-6acf56bb86d9\" (UID: \"cf6302c9-3377-4a44-bf25-6acf56bb86d9\") " Mar 20 12:30:03 crc kubenswrapper[4817]: I0320 12:30:03.300385 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf6302c9-3377-4a44-bf25-6acf56bb86d9-secret-volume\") pod \"cf6302c9-3377-4a44-bf25-6acf56bb86d9\" (UID: \"cf6302c9-3377-4a44-bf25-6acf56bb86d9\") " Mar 20 12:30:03 crc kubenswrapper[4817]: I0320 12:30:03.301686 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf6302c9-3377-4a44-bf25-6acf56bb86d9-config-volume" (OuterVolumeSpecName: "config-volume") pod "cf6302c9-3377-4a44-bf25-6acf56bb86d9" (UID: 
"cf6302c9-3377-4a44-bf25-6acf56bb86d9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:30:03 crc kubenswrapper[4817]: I0320 12:30:03.306462 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf6302c9-3377-4a44-bf25-6acf56bb86d9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cf6302c9-3377-4a44-bf25-6acf56bb86d9" (UID: "cf6302c9-3377-4a44-bf25-6acf56bb86d9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:30:03 crc kubenswrapper[4817]: I0320 12:30:03.307012 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf6302c9-3377-4a44-bf25-6acf56bb86d9-kube-api-access-2kc7r" (OuterVolumeSpecName: "kube-api-access-2kc7r") pod "cf6302c9-3377-4a44-bf25-6acf56bb86d9" (UID: "cf6302c9-3377-4a44-bf25-6acf56bb86d9"). InnerVolumeSpecName "kube-api-access-2kc7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:30:03 crc kubenswrapper[4817]: I0320 12:30:03.402010 4817 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf6302c9-3377-4a44-bf25-6acf56bb86d9-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:03 crc kubenswrapper[4817]: I0320 12:30:03.402050 4817 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf6302c9-3377-4a44-bf25-6acf56bb86d9-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:03 crc kubenswrapper[4817]: I0320 12:30:03.402062 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kc7r\" (UniqueName: \"kubernetes.io/projected/cf6302c9-3377-4a44-bf25-6acf56bb86d9-kube-api-access-2kc7r\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:03 crc kubenswrapper[4817]: I0320 12:30:03.973455 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm" event={"ID":"cf6302c9-3377-4a44-bf25-6acf56bb86d9","Type":"ContainerDied","Data":"dab47014c467d251452a1d41965bd504d31530f77efbf8b018b8c4ed9a726b6e"} Mar 20 12:30:03 crc kubenswrapper[4817]: I0320 12:30:03.973495 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dab47014c467d251452a1d41965bd504d31530f77efbf8b018b8c4ed9a726b6e" Mar 20 12:30:03 crc kubenswrapper[4817]: I0320 12:30:03.973546 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566830-nbdvm" Mar 20 12:30:04 crc kubenswrapper[4817]: I0320 12:30:04.609497 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 12:30:04 crc kubenswrapper[4817]: I0320 12:30:04.693843 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 12:30:05 crc kubenswrapper[4817]: I0320 12:30:05.310780 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qjbdq" Mar 20 12:30:05 crc kubenswrapper[4817]: I0320 12:30:05.310859 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qjbdq" Mar 20 12:30:05 crc kubenswrapper[4817]: I0320 12:30:05.475402 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qjbdq" Mar 20 12:30:05 crc kubenswrapper[4817]: I0320 12:30:05.514274 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.5142507859999998 podStartE2EDuration="1.514250786s" podCreationTimestamp="2026-03-20 12:30:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:30:05.511937109 +0000 UTC m=+167.600249902" watchObservedRunningTime="2026-03-20 12:30:05.514250786 +0000 UTC m=+167.602563569" Mar 20 12:30:06 crc kubenswrapper[4817]: I0320 12:30:06.027708 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qjbdq" Mar 20 12:30:06 crc kubenswrapper[4817]: I0320 12:30:06.110421 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjbdq"] Mar 20 12:30:07 crc kubenswrapper[4817]: I0320 12:30:07.424683 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tqqkf" Mar 20 12:30:07 crc kubenswrapper[4817]: I0320 12:30:07.425153 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tqqkf" Mar 20 12:30:07 crc kubenswrapper[4817]: I0320 12:30:07.486355 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tqqkf" Mar 20 12:30:07 crc kubenswrapper[4817]: I0320 12:30:07.996224 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qjbdq" podUID="7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9" containerName="registry-server" containerID="cri-o://2f9f20cb3abdd8453a9773ff3c869181f48619407bc39bd78278fb7bc1cf4e31" gracePeriod=2 Mar 20 12:30:08 crc kubenswrapper[4817]: I0320 12:30:08.038272 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tqqkf" Mar 20 12:30:08 crc kubenswrapper[4817]: I0320 12:30:08.508619 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqqkf"] Mar 20 12:30:08 crc kubenswrapper[4817]: I0320 12:30:08.890207 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qjbdq" Mar 20 12:30:08 crc kubenswrapper[4817]: I0320 12:30:08.984107 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-utilities\") pod \"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9\" (UID: \"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9\") " Mar 20 12:30:08 crc kubenswrapper[4817]: I0320 12:30:08.984227 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-catalog-content\") pod \"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9\" (UID: \"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9\") " Mar 20 12:30:08 crc kubenswrapper[4817]: I0320 12:30:08.985394 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l62f2\" (UniqueName: \"kubernetes.io/projected/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-kube-api-access-l62f2\") pod \"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9\" (UID: \"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9\") " Mar 20 12:30:08 crc kubenswrapper[4817]: I0320 12:30:08.985967 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-utilities" (OuterVolumeSpecName: "utilities") pod "7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9" (UID: "7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.004558 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-kube-api-access-l62f2" (OuterVolumeSpecName: "kube-api-access-l62f2") pod "7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9" (UID: "7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9"). InnerVolumeSpecName "kube-api-access-l62f2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.008319 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566830-tj7ps" event={"ID":"a41c54b1-d599-494b-8476-a2700bb5424f","Type":"ContainerStarted","Data":"670f2c27e30d2b7d61efd988cdf39c04303f7f99219802767f9e2ea2f89fb907"} Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.016659 4817 generic.go:334] "Generic (PLEG): container finished" podID="7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9" containerID="2f9f20cb3abdd8453a9773ff3c869181f48619407bc39bd78278fb7bc1cf4e31" exitCode=0 Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.016714 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjbdq" event={"ID":"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9","Type":"ContainerDied","Data":"2f9f20cb3abdd8453a9773ff3c869181f48619407bc39bd78278fb7bc1cf4e31"} Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.016736 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qjbdq" Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.016764 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjbdq" event={"ID":"7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9","Type":"ContainerDied","Data":"06d34d7c10958f4e924fc0ef3ee77766aa5bd4c935967d9d274be0ddfc1da8ae"} Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.016781 4817 scope.go:117] "RemoveContainer" containerID="2f9f20cb3abdd8453a9773ff3c869181f48619407bc39bd78278fb7bc1cf4e31" Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.031694 4817 scope.go:117] "RemoveContainer" containerID="645ae9ad4b0ed0cef6aab7da0e164b7f357446dd381a2ff144b9139cf06ae5bf" Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.046769 4817 scope.go:117] "RemoveContainer" containerID="5f6323f249534e5cbc3405b0650e9d332cb5bfb6a8cd121abe417e9ab9aeaf87" Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.062936 4817 scope.go:117] "RemoveContainer" containerID="2f9f20cb3abdd8453a9773ff3c869181f48619407bc39bd78278fb7bc1cf4e31" Mar 20 12:30:09 crc kubenswrapper[4817]: E0320 12:30:09.063447 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f9f20cb3abdd8453a9773ff3c869181f48619407bc39bd78278fb7bc1cf4e31\": container with ID starting with 2f9f20cb3abdd8453a9773ff3c869181f48619407bc39bd78278fb7bc1cf4e31 not found: ID does not exist" containerID="2f9f20cb3abdd8453a9773ff3c869181f48619407bc39bd78278fb7bc1cf4e31" Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.063491 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f9f20cb3abdd8453a9773ff3c869181f48619407bc39bd78278fb7bc1cf4e31"} err="failed to get container status \"2f9f20cb3abdd8453a9773ff3c869181f48619407bc39bd78278fb7bc1cf4e31\": rpc error: code = NotFound desc = could not find container 
\"2f9f20cb3abdd8453a9773ff3c869181f48619407bc39bd78278fb7bc1cf4e31\": container with ID starting with 2f9f20cb3abdd8453a9773ff3c869181f48619407bc39bd78278fb7bc1cf4e31 not found: ID does not exist" Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.063525 4817 scope.go:117] "RemoveContainer" containerID="645ae9ad4b0ed0cef6aab7da0e164b7f357446dd381a2ff144b9139cf06ae5bf" Mar 20 12:30:09 crc kubenswrapper[4817]: E0320 12:30:09.063871 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"645ae9ad4b0ed0cef6aab7da0e164b7f357446dd381a2ff144b9139cf06ae5bf\": container with ID starting with 645ae9ad4b0ed0cef6aab7da0e164b7f357446dd381a2ff144b9139cf06ae5bf not found: ID does not exist" containerID="645ae9ad4b0ed0cef6aab7da0e164b7f357446dd381a2ff144b9139cf06ae5bf" Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.063907 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"645ae9ad4b0ed0cef6aab7da0e164b7f357446dd381a2ff144b9139cf06ae5bf"} err="failed to get container status \"645ae9ad4b0ed0cef6aab7da0e164b7f357446dd381a2ff144b9139cf06ae5bf\": rpc error: code = NotFound desc = could not find container \"645ae9ad4b0ed0cef6aab7da0e164b7f357446dd381a2ff144b9139cf06ae5bf\": container with ID starting with 645ae9ad4b0ed0cef6aab7da0e164b7f357446dd381a2ff144b9139cf06ae5bf not found: ID does not exist" Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.063928 4817 scope.go:117] "RemoveContainer" containerID="5f6323f249534e5cbc3405b0650e9d332cb5bfb6a8cd121abe417e9ab9aeaf87" Mar 20 12:30:09 crc kubenswrapper[4817]: E0320 12:30:09.064405 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f6323f249534e5cbc3405b0650e9d332cb5bfb6a8cd121abe417e9ab9aeaf87\": container with ID starting with 5f6323f249534e5cbc3405b0650e9d332cb5bfb6a8cd121abe417e9ab9aeaf87 not found: ID does not exist" 
containerID="5f6323f249534e5cbc3405b0650e9d332cb5bfb6a8cd121abe417e9ab9aeaf87" Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.064437 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f6323f249534e5cbc3405b0650e9d332cb5bfb6a8cd121abe417e9ab9aeaf87"} err="failed to get container status \"5f6323f249534e5cbc3405b0650e9d332cb5bfb6a8cd121abe417e9ab9aeaf87\": rpc error: code = NotFound desc = could not find container \"5f6323f249534e5cbc3405b0650e9d332cb5bfb6a8cd121abe417e9ab9aeaf87\": container with ID starting with 5f6323f249534e5cbc3405b0650e9d332cb5bfb6a8cd121abe417e9ab9aeaf87 not found: ID does not exist" Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.087032 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l62f2\" (UniqueName: \"kubernetes.io/projected/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-kube-api-access-l62f2\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.087061 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.312465 4817 csr.go:261] certificate signing request csr-bqpqn is approved, waiting to be issued Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.318379 4817 csr.go:257] certificate signing request csr-bqpqn is issued Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.516346 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9" (UID: "7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.532912 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.642435 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjbdq"] Mar 20 12:30:09 crc kubenswrapper[4817]: I0320 12:30:09.648319 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qjbdq"] Mar 20 12:30:10 crc kubenswrapper[4817]: I0320 12:30:10.025043 4817 generic.go:334] "Generic (PLEG): container finished" podID="a41c54b1-d599-494b-8476-a2700bb5424f" containerID="670f2c27e30d2b7d61efd988cdf39c04303f7f99219802767f9e2ea2f89fb907" exitCode=0 Mar 20 12:30:10 crc kubenswrapper[4817]: I0320 12:30:10.025097 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566830-tj7ps" event={"ID":"a41c54b1-d599-494b-8476-a2700bb5424f","Type":"ContainerDied","Data":"670f2c27e30d2b7d61efd988cdf39c04303f7f99219802767f9e2ea2f89fb907"} Mar 20 12:30:10 crc kubenswrapper[4817]: I0320 12:30:10.027287 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tqqkf" podUID="bd837d56-d9f1-4396-975c-5917f6f32bc9" containerName="registry-server" containerID="cri-o://7e43bf2f844a366b1a276ff30bf1d2c0bec23103c7fa576f5e98893fdda9e672" gracePeriod=2 Mar 20 12:30:10 crc kubenswrapper[4817]: E0320 12:30:10.081580 4817 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd837d56_d9f1_4396_975c_5917f6f32bc9.slice/crio-7e43bf2f844a366b1a276ff30bf1d2c0bec23103c7fa576f5e98893fdda9e672.scope\": RecentStats: 
unable to find data in memory cache]" Mar 20 12:30:10 crc kubenswrapper[4817]: I0320 12:30:10.320045 4817 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-05 20:18:14.675357942 +0000 UTC Mar 20 12:30:10 crc kubenswrapper[4817]: I0320 12:30:10.320076 4817 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6247h48m4.355284392s for next certificate rotation Mar 20 12:30:10 crc kubenswrapper[4817]: I0320 12:30:10.472542 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tqqkf" Mar 20 12:30:10 crc kubenswrapper[4817]: I0320 12:30:10.553312 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd837d56-d9f1-4396-975c-5917f6f32bc9-utilities\") pod \"bd837d56-d9f1-4396-975c-5917f6f32bc9\" (UID: \"bd837d56-d9f1-4396-975c-5917f6f32bc9\") " Mar 20 12:30:10 crc kubenswrapper[4817]: I0320 12:30:10.553395 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnb57\" (UniqueName: \"kubernetes.io/projected/bd837d56-d9f1-4396-975c-5917f6f32bc9-kube-api-access-bnb57\") pod \"bd837d56-d9f1-4396-975c-5917f6f32bc9\" (UID: \"bd837d56-d9f1-4396-975c-5917f6f32bc9\") " Mar 20 12:30:10 crc kubenswrapper[4817]: I0320 12:30:10.553420 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd837d56-d9f1-4396-975c-5917f6f32bc9-catalog-content\") pod \"bd837d56-d9f1-4396-975c-5917f6f32bc9\" (UID: \"bd837d56-d9f1-4396-975c-5917f6f32bc9\") " Mar 20 12:30:10 crc kubenswrapper[4817]: I0320 12:30:10.554550 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd837d56-d9f1-4396-975c-5917f6f32bc9-utilities" (OuterVolumeSpecName: "utilities") pod 
"bd837d56-d9f1-4396-975c-5917f6f32bc9" (UID: "bd837d56-d9f1-4396-975c-5917f6f32bc9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:30:10 crc kubenswrapper[4817]: I0320 12:30:10.559937 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd837d56-d9f1-4396-975c-5917f6f32bc9-kube-api-access-bnb57" (OuterVolumeSpecName: "kube-api-access-bnb57") pod "bd837d56-d9f1-4396-975c-5917f6f32bc9" (UID: "bd837d56-d9f1-4396-975c-5917f6f32bc9"). InnerVolumeSpecName "kube-api-access-bnb57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:30:10 crc kubenswrapper[4817]: I0320 12:30:10.586616 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd837d56-d9f1-4396-975c-5917f6f32bc9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd837d56-d9f1-4396-975c-5917f6f32bc9" (UID: "bd837d56-d9f1-4396-975c-5917f6f32bc9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:30:10 crc kubenswrapper[4817]: I0320 12:30:10.654490 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd837d56-d9f1-4396-975c-5917f6f32bc9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:10 crc kubenswrapper[4817]: I0320 12:30:10.654536 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnb57\" (UniqueName: \"kubernetes.io/projected/bd837d56-d9f1-4396-975c-5917f6f32bc9-kube-api-access-bnb57\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:10 crc kubenswrapper[4817]: I0320 12:30:10.654550 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd837d56-d9f1-4396-975c-5917f6f32bc9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:10 crc kubenswrapper[4817]: I0320 12:30:10.674812 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9" path="/var/lib/kubelet/pods/7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9/volumes" Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.035459 4817 generic.go:334] "Generic (PLEG): container finished" podID="bd837d56-d9f1-4396-975c-5917f6f32bc9" containerID="7e43bf2f844a366b1a276ff30bf1d2c0bec23103c7fa576f5e98893fdda9e672" exitCode=0 Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.035530 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tqqkf" Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.035549 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqqkf" event={"ID":"bd837d56-d9f1-4396-975c-5917f6f32bc9","Type":"ContainerDied","Data":"7e43bf2f844a366b1a276ff30bf1d2c0bec23103c7fa576f5e98893fdda9e672"} Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.035980 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqqkf" event={"ID":"bd837d56-d9f1-4396-975c-5917f6f32bc9","Type":"ContainerDied","Data":"c3207e1eb6eab0ee10293971afdbbd0635c3767a9687beb575f071166d80b548"} Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.036022 4817 scope.go:117] "RemoveContainer" containerID="7e43bf2f844a366b1a276ff30bf1d2c0bec23103c7fa576f5e98893fdda9e672" Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.040045 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zb6t" event={"ID":"790f1757-c8f1-4a1b-93aa-c476aed2e981","Type":"ContainerStarted","Data":"15c5e7233dfcf3f9bb645402d51c258820e0c0d294e0a80990824124f0833118"} Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.058319 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqqkf"] Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.061615 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqqkf"] Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.062685 4817 scope.go:117] "RemoveContainer" containerID="43aa945367b972f7675e04316818ca6764a073746e9b97df2c970e921c08ba86" Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.088276 4817 scope.go:117] "RemoveContainer" containerID="63e3829bc22d184ab6399dd41648568db1d6cf5e16f4f4abd1e3fa8a51323558" Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.114302 4817 
scope.go:117] "RemoveContainer" containerID="7e43bf2f844a366b1a276ff30bf1d2c0bec23103c7fa576f5e98893fdda9e672" Mar 20 12:30:11 crc kubenswrapper[4817]: E0320 12:30:11.114890 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e43bf2f844a366b1a276ff30bf1d2c0bec23103c7fa576f5e98893fdda9e672\": container with ID starting with 7e43bf2f844a366b1a276ff30bf1d2c0bec23103c7fa576f5e98893fdda9e672 not found: ID does not exist" containerID="7e43bf2f844a366b1a276ff30bf1d2c0bec23103c7fa576f5e98893fdda9e672" Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.114931 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e43bf2f844a366b1a276ff30bf1d2c0bec23103c7fa576f5e98893fdda9e672"} err="failed to get container status \"7e43bf2f844a366b1a276ff30bf1d2c0bec23103c7fa576f5e98893fdda9e672\": rpc error: code = NotFound desc = could not find container \"7e43bf2f844a366b1a276ff30bf1d2c0bec23103c7fa576f5e98893fdda9e672\": container with ID starting with 7e43bf2f844a366b1a276ff30bf1d2c0bec23103c7fa576f5e98893fdda9e672 not found: ID does not exist" Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.114964 4817 scope.go:117] "RemoveContainer" containerID="43aa945367b972f7675e04316818ca6764a073746e9b97df2c970e921c08ba86" Mar 20 12:30:11 crc kubenswrapper[4817]: E0320 12:30:11.115768 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43aa945367b972f7675e04316818ca6764a073746e9b97df2c970e921c08ba86\": container with ID starting with 43aa945367b972f7675e04316818ca6764a073746e9b97df2c970e921c08ba86 not found: ID does not exist" containerID="43aa945367b972f7675e04316818ca6764a073746e9b97df2c970e921c08ba86" Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.115806 4817 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"43aa945367b972f7675e04316818ca6764a073746e9b97df2c970e921c08ba86"} err="failed to get container status \"43aa945367b972f7675e04316818ca6764a073746e9b97df2c970e921c08ba86\": rpc error: code = NotFound desc = could not find container \"43aa945367b972f7675e04316818ca6764a073746e9b97df2c970e921c08ba86\": container with ID starting with 43aa945367b972f7675e04316818ca6764a073746e9b97df2c970e921c08ba86 not found: ID does not exist" Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.115835 4817 scope.go:117] "RemoveContainer" containerID="63e3829bc22d184ab6399dd41648568db1d6cf5e16f4f4abd1e3fa8a51323558" Mar 20 12:30:11 crc kubenswrapper[4817]: E0320 12:30:11.116261 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63e3829bc22d184ab6399dd41648568db1d6cf5e16f4f4abd1e3fa8a51323558\": container with ID starting with 63e3829bc22d184ab6399dd41648568db1d6cf5e16f4f4abd1e3fa8a51323558 not found: ID does not exist" containerID="63e3829bc22d184ab6399dd41648568db1d6cf5e16f4f4abd1e3fa8a51323558" Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.116318 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63e3829bc22d184ab6399dd41648568db1d6cf5e16f4f4abd1e3fa8a51323558"} err="failed to get container status \"63e3829bc22d184ab6399dd41648568db1d6cf5e16f4f4abd1e3fa8a51323558\": rpc error: code = NotFound desc = could not find container \"63e3829bc22d184ab6399dd41648568db1d6cf5e16f4f4abd1e3fa8a51323558\": container with ID starting with 63e3829bc22d184ab6399dd41648568db1d6cf5e16f4f4abd1e3fa8a51323558 not found: ID does not exist" Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.320682 4817 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-12 00:00:09.542501874 +0000 UTC Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 
12:30:11.320738 4817 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7139h29m58.2217695s for next certificate rotation Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.414678 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566830-tj7ps" Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.466706 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgjn6\" (UniqueName: \"kubernetes.io/projected/a41c54b1-d599-494b-8476-a2700bb5424f-kube-api-access-rgjn6\") pod \"a41c54b1-d599-494b-8476-a2700bb5424f\" (UID: \"a41c54b1-d599-494b-8476-a2700bb5424f\") " Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.473204 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41c54b1-d599-494b-8476-a2700bb5424f-kube-api-access-rgjn6" (OuterVolumeSpecName: "kube-api-access-rgjn6") pod "a41c54b1-d599-494b-8476-a2700bb5424f" (UID: "a41c54b1-d599-494b-8476-a2700bb5424f"). InnerVolumeSpecName "kube-api-access-rgjn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:30:11 crc kubenswrapper[4817]: I0320 12:30:11.568660 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgjn6\" (UniqueName: \"kubernetes.io/projected/a41c54b1-d599-494b-8476-a2700bb5424f-kube-api-access-rgjn6\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:12 crc kubenswrapper[4817]: I0320 12:30:12.051553 4817 generic.go:334] "Generic (PLEG): container finished" podID="21934de0-bbff-4fcf-ad45-b3a6a2461030" containerID="33bd90d283bfacaa66e5d5c5ec21c5b6bff7eb5df7023e0b6cdc987c159411e5" exitCode=0 Mar 20 12:30:12 crc kubenswrapper[4817]: I0320 12:30:12.051615 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cgvz" event={"ID":"21934de0-bbff-4fcf-ad45-b3a6a2461030","Type":"ContainerDied","Data":"33bd90d283bfacaa66e5d5c5ec21c5b6bff7eb5df7023e0b6cdc987c159411e5"} Mar 20 12:30:12 crc kubenswrapper[4817]: I0320 12:30:12.053683 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566830-tj7ps" event={"ID":"a41c54b1-d599-494b-8476-a2700bb5424f","Type":"ContainerDied","Data":"9cb3bad0778657a03f25df1d3b17335f779d13f022f1f84ec96b3a6cbe5e5581"} Mar 20 12:30:12 crc kubenswrapper[4817]: I0320 12:30:12.053712 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cb3bad0778657a03f25df1d3b17335f779d13f022f1f84ec96b3a6cbe5e5581" Mar 20 12:30:12 crc kubenswrapper[4817]: I0320 12:30:12.053710 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566830-tj7ps" Mar 20 12:30:12 crc kubenswrapper[4817]: I0320 12:30:12.056184 4817 generic.go:334] "Generic (PLEG): container finished" podID="790f1757-c8f1-4a1b-93aa-c476aed2e981" containerID="15c5e7233dfcf3f9bb645402d51c258820e0c0d294e0a80990824124f0833118" exitCode=0 Mar 20 12:30:12 crc kubenswrapper[4817]: I0320 12:30:12.056284 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zb6t" event={"ID":"790f1757-c8f1-4a1b-93aa-c476aed2e981","Type":"ContainerDied","Data":"15c5e7233dfcf3f9bb645402d51c258820e0c0d294e0a80990824124f0833118"} Mar 20 12:30:12 crc kubenswrapper[4817]: I0320 12:30:12.672711 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd837d56-d9f1-4396-975c-5917f6f32bc9" path="/var/lib/kubelet/pods/bd837d56-d9f1-4396-975c-5917f6f32bc9/volumes" Mar 20 12:30:13 crc kubenswrapper[4817]: I0320 12:30:13.067515 4817 generic.go:334] "Generic (PLEG): container finished" podID="98014d2a-ca27-4147-a4cf-081ce9325a83" containerID="9700e09fcd2f7975e31e92f09f9a347557d38d9caa6d51d222e5629fa3d34ced" exitCode=0 Mar 20 12:30:13 crc kubenswrapper[4817]: I0320 12:30:13.067566 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sg4sx" event={"ID":"98014d2a-ca27-4147-a4cf-081ce9325a83","Type":"ContainerDied","Data":"9700e09fcd2f7975e31e92f09f9a347557d38d9caa6d51d222e5629fa3d34ced"} Mar 20 12:30:13 crc kubenswrapper[4817]: I0320 12:30:13.071237 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cgvz" event={"ID":"21934de0-bbff-4fcf-ad45-b3a6a2461030","Type":"ContainerStarted","Data":"748c734a98db313973b466dd100361a9067d454ef30967160927628cd637d66c"} Mar 20 12:30:13 crc kubenswrapper[4817]: I0320 12:30:13.075847 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j47ss" 
event={"ID":"430036d2-3961-4715-b47b-c7d670e2ee26","Type":"ContainerStarted","Data":"df766725add0660051b212b51484cfb302e029af7d1677e5c4896c51456e3128"} Mar 20 12:30:13 crc kubenswrapper[4817]: I0320 12:30:13.078484 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zb6t" event={"ID":"790f1757-c8f1-4a1b-93aa-c476aed2e981","Type":"ContainerStarted","Data":"bab84a0e1242105f894986d944df5eee95184f1940278823d4192f7776ad0746"} Mar 20 12:30:13 crc kubenswrapper[4817]: I0320 12:30:13.119364 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7zb6t" podStartSLOduration=2.8160576920000002 podStartE2EDuration="56.119336444s" podCreationTimestamp="2026-03-20 12:29:17 +0000 UTC" firstStartedPulling="2026-03-20 12:29:19.191695058 +0000 UTC m=+121.280007841" lastFinishedPulling="2026-03-20 12:30:12.49497381 +0000 UTC m=+174.583286593" observedRunningTime="2026-03-20 12:30:13.116551663 +0000 UTC m=+175.204864446" watchObservedRunningTime="2026-03-20 12:30:13.119336444 +0000 UTC m=+175.207649227" Mar 20 12:30:13 crc kubenswrapper[4817]: I0320 12:30:13.135948 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8cgvz" podStartSLOduration=2.104702682 podStartE2EDuration="57.135927315s" podCreationTimestamp="2026-03-20 12:29:16 +0000 UTC" firstStartedPulling="2026-03-20 12:29:17.872990534 +0000 UTC m=+119.961303317" lastFinishedPulling="2026-03-20 12:30:12.904215167 +0000 UTC m=+174.992527950" observedRunningTime="2026-03-20 12:30:13.134663218 +0000 UTC m=+175.222976001" watchObservedRunningTime="2026-03-20 12:30:13.135927315 +0000 UTC m=+175.224240098" Mar 20 12:30:14 crc kubenswrapper[4817]: I0320 12:30:14.085392 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sg4sx" 
event={"ID":"98014d2a-ca27-4147-a4cf-081ce9325a83","Type":"ContainerStarted","Data":"6cbbc4f11ddc84d546f7bf97b32e3b3a4a89881e4c62cedee401dd8b238a9bf2"} Mar 20 12:30:14 crc kubenswrapper[4817]: I0320 12:30:14.086898 4817 generic.go:334] "Generic (PLEG): container finished" podID="430036d2-3961-4715-b47b-c7d670e2ee26" containerID="df766725add0660051b212b51484cfb302e029af7d1677e5c4896c51456e3128" exitCode=0 Mar 20 12:30:14 crc kubenswrapper[4817]: I0320 12:30:14.086933 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j47ss" event={"ID":"430036d2-3961-4715-b47b-c7d670e2ee26","Type":"ContainerDied","Data":"df766725add0660051b212b51484cfb302e029af7d1677e5c4896c51456e3128"} Mar 20 12:30:14 crc kubenswrapper[4817]: I0320 12:30:14.109909 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sg4sx" podStartSLOduration=4.511430446 podStartE2EDuration="1m0.109891335s" podCreationTimestamp="2026-03-20 12:29:14 +0000 UTC" firstStartedPulling="2026-03-20 12:29:17.908158734 +0000 UTC m=+119.996471517" lastFinishedPulling="2026-03-20 12:30:13.506619623 +0000 UTC m=+175.594932406" observedRunningTime="2026-03-20 12:30:14.108520015 +0000 UTC m=+176.196832818" watchObservedRunningTime="2026-03-20 12:30:14.109891335 +0000 UTC m=+176.198204118" Mar 20 12:30:15 crc kubenswrapper[4817]: I0320 12:30:15.093844 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fb87f" event={"ID":"870bce80-0a37-4d5f-afbf-ffa85ea34a03","Type":"ContainerStarted","Data":"3a1284d1b74da260f96ec89f897371c7619879cf2fe1b0e9fa72345512f5fdc4"} Mar 20 12:30:15 crc kubenswrapper[4817]: I0320 12:30:15.096308 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52mbr" 
event={"ID":"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7","Type":"ContainerStarted","Data":"0a675bf20f679ec802c7d556ea2142718794f40f1403c1dd1b9fc2165b250629"}
Mar 20 12:30:15 crc kubenswrapper[4817]: I0320 12:30:15.099415 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j47ss" event={"ID":"430036d2-3961-4715-b47b-c7d670e2ee26","Type":"ContainerStarted","Data":"8a01b3692710156b1f84225f87bf78ed9b128ef6738cff3d8998dd7f5ee93ce9"}
Mar 20 12:30:15 crc kubenswrapper[4817]: I0320 12:30:15.118228 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sg4sx"
Mar 20 12:30:15 crc kubenswrapper[4817]: I0320 12:30:15.118274 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sg4sx"
Mar 20 12:30:15 crc kubenswrapper[4817]: I0320 12:30:15.133527 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j47ss" podStartSLOduration=2.611574926 podStartE2EDuration="57.133509075s" podCreationTimestamp="2026-03-20 12:29:18 +0000 UTC" firstStartedPulling="2026-03-20 12:29:20.296974614 +0000 UTC m=+122.385287397" lastFinishedPulling="2026-03-20 12:30:14.818908763 +0000 UTC m=+176.907221546" observedRunningTime="2026-03-20 12:30:15.131428235 +0000 UTC m=+177.219741018" watchObservedRunningTime="2026-03-20 12:30:15.133509075 +0000 UTC m=+177.221821858"
Mar 20 12:30:16 crc kubenswrapper[4817]: I0320 12:30:16.105851 4817 generic.go:334] "Generic (PLEG): container finished" podID="870bce80-0a37-4d5f-afbf-ffa85ea34a03" containerID="3a1284d1b74da260f96ec89f897371c7619879cf2fe1b0e9fa72345512f5fdc4" exitCode=0
Mar 20 12:30:16 crc kubenswrapper[4817]: I0320 12:30:16.105944 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fb87f" event={"ID":"870bce80-0a37-4d5f-afbf-ffa85ea34a03","Type":"ContainerDied","Data":"3a1284d1b74da260f96ec89f897371c7619879cf2fe1b0e9fa72345512f5fdc4"}
Mar 20 12:30:16 crc kubenswrapper[4817]: I0320 12:30:16.109164 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52mbr" event={"ID":"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7","Type":"ContainerDied","Data":"0a675bf20f679ec802c7d556ea2142718794f40f1403c1dd1b9fc2165b250629"}
Mar 20 12:30:16 crc kubenswrapper[4817]: I0320 12:30:16.109061 4817 generic.go:334] "Generic (PLEG): container finished" podID="bd0c8df8-231e-4c91-8cfe-0182cd07e3d7" containerID="0a675bf20f679ec802c7d556ea2142718794f40f1403c1dd1b9fc2165b250629" exitCode=0
Mar 20 12:30:16 crc kubenswrapper[4817]: I0320 12:30:16.161809 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sg4sx" podUID="98014d2a-ca27-4147-a4cf-081ce9325a83" containerName="registry-server" probeResult="failure" output=<
Mar 20 12:30:16 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s
Mar 20 12:30:16 crc kubenswrapper[4817]: >
Mar 20 12:30:17 crc kubenswrapper[4817]: I0320 12:30:17.012616 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8cgvz"
Mar 20 12:30:17 crc kubenswrapper[4817]: I0320 12:30:17.013038 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8cgvz"
Mar 20 12:30:17 crc kubenswrapper[4817]: I0320 12:30:17.062982 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8cgvz"
Mar 20 12:30:17 crc kubenswrapper[4817]: I0320 12:30:17.117226 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fb87f" event={"ID":"870bce80-0a37-4d5f-afbf-ffa85ea34a03","Type":"ContainerStarted","Data":"df88e31d7bcbb8df54ab4a13a83a833c6e4b74bbe61c2f23d369db3a6e57f9a6"}
Mar 20 12:30:17 crc kubenswrapper[4817]: I0320 12:30:17.120183 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52mbr" event={"ID":"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7","Type":"ContainerStarted","Data":"5980199afc2469b92fe453d8c2512860922e1b8b816e0392cbd8e9ebfc8c8870"}
Mar 20 12:30:17 crc kubenswrapper[4817]: I0320 12:30:17.135762 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fb87f" podStartSLOduration=3.413357739 podStartE2EDuration="1m2.135743129s" podCreationTimestamp="2026-03-20 12:29:15 +0000 UTC" firstStartedPulling="2026-03-20 12:29:17.85743461 +0000 UTC m=+119.945747393" lastFinishedPulling="2026-03-20 12:30:16.57982 +0000 UTC m=+178.668132783" observedRunningTime="2026-03-20 12:30:17.135587314 +0000 UTC m=+179.223900117" watchObservedRunningTime="2026-03-20 12:30:17.135743129 +0000 UTC m=+179.224055922"
Mar 20 12:30:17 crc kubenswrapper[4817]: I0320 12:30:17.158025 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-52mbr" podStartSLOduration=4.455808012 podStartE2EDuration="1m3.158003834s" podCreationTimestamp="2026-03-20 12:29:14 +0000 UTC" firstStartedPulling="2026-03-20 12:29:17.899657177 +0000 UTC m=+119.987969960" lastFinishedPulling="2026-03-20 12:30:16.601852999 +0000 UTC m=+178.690165782" observedRunningTime="2026-03-20 12:30:17.15647423 +0000 UTC m=+179.244787013" watchObservedRunningTime="2026-03-20 12:30:17.158003834 +0000 UTC m=+179.246316617"
Mar 20 12:30:18 crc kubenswrapper[4817]: I0320 12:30:18.001200 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7zb6t"
Mar 20 12:30:18 crc kubenswrapper[4817]: I0320 12:30:18.001261 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7zb6t"
Mar 20 12:30:18 crc kubenswrapper[4817]: I0320 12:30:18.404522 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j47ss"
Mar 20 12:30:18 crc kubenswrapper[4817]: I0320 12:30:18.404579 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j47ss"
Mar 20 12:30:19 crc kubenswrapper[4817]: I0320 12:30:19.043251 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7zb6t" podUID="790f1757-c8f1-4a1b-93aa-c476aed2e981" containerName="registry-server" probeResult="failure" output=<
Mar 20 12:30:19 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s
Mar 20 12:30:19 crc kubenswrapper[4817]: >
Mar 20 12:30:19 crc kubenswrapper[4817]: I0320 12:30:19.442278 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j47ss" podUID="430036d2-3961-4715-b47b-c7d670e2ee26" containerName="registry-server" probeResult="failure" output=<
Mar 20 12:30:19 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s
Mar 20 12:30:19 crc kubenswrapper[4817]: >
Mar 20 12:30:25 crc kubenswrapper[4817]: I0320 12:30:25.141786 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-52mbr"
Mar 20 12:30:25 crc kubenswrapper[4817]: I0320 12:30:25.142382 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-52mbr"
Mar 20 12:30:25 crc kubenswrapper[4817]: I0320 12:30:25.176590 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sg4sx"
Mar 20 12:30:25 crc kubenswrapper[4817]: I0320 12:30:25.219918 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-52mbr"
Mar 20 12:30:25 crc kubenswrapper[4817]: I0320 12:30:25.233780 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sg4sx"
Mar 20 12:30:25 crc kubenswrapper[4817]: I0320 12:30:25.285173 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-52mbr"
Mar 20 12:30:25 crc kubenswrapper[4817]: I0320 12:30:25.524883 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fb87f"
Mar 20 12:30:25 crc kubenswrapper[4817]: I0320 12:30:25.525325 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fb87f"
Mar 20 12:30:25 crc kubenswrapper[4817]: I0320 12:30:25.591715 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fb87f"
Mar 20 12:30:26 crc kubenswrapper[4817]: I0320 12:30:26.217313 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fb87f"
Mar 20 12:30:27 crc kubenswrapper[4817]: I0320 12:30:27.086492 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8cgvz"
Mar 20 12:30:28 crc kubenswrapper[4817]: I0320 12:30:28.019733 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fb87f"]
Mar 20 12:30:28 crc kubenswrapper[4817]: I0320 12:30:28.074565 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7zb6t"
Mar 20 12:30:28 crc kubenswrapper[4817]: I0320 12:30:28.147880 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7zb6t"
Mar 20 12:30:28 crc kubenswrapper[4817]: I0320 12:30:28.473685 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j47ss"
Mar 20 12:30:28 crc kubenswrapper[4817]: I0320 12:30:28.530975 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j47ss"
Mar 20 12:30:29 crc kubenswrapper[4817]: I0320 12:30:29.245794 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fb87f" podUID="870bce80-0a37-4d5f-afbf-ffa85ea34a03" containerName="registry-server" containerID="cri-o://df88e31d7bcbb8df54ab4a13a83a833c6e4b74bbe61c2f23d369db3a6e57f9a6" gracePeriod=2
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.242887 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fb87f"
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.254031 4817 generic.go:334] "Generic (PLEG): container finished" podID="870bce80-0a37-4d5f-afbf-ffa85ea34a03" containerID="df88e31d7bcbb8df54ab4a13a83a833c6e4b74bbe61c2f23d369db3a6e57f9a6" exitCode=0
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.254077 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fb87f" event={"ID":"870bce80-0a37-4d5f-afbf-ffa85ea34a03","Type":"ContainerDied","Data":"df88e31d7bcbb8df54ab4a13a83a833c6e4b74bbe61c2f23d369db3a6e57f9a6"}
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.254111 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fb87f"
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.254154 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fb87f" event={"ID":"870bce80-0a37-4d5f-afbf-ffa85ea34a03","Type":"ContainerDied","Data":"1bc22e3c69c1b77b2052a1139281a6650ff94a22d4dd5765131a8f823d5a35d9"}
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.254177 4817 scope.go:117] "RemoveContainer" containerID="df88e31d7bcbb8df54ab4a13a83a833c6e4b74bbe61c2f23d369db3a6e57f9a6"
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.288454 4817 scope.go:117] "RemoveContainer" containerID="3a1284d1b74da260f96ec89f897371c7619879cf2fe1b0e9fa72345512f5fdc4"
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.305530 4817 scope.go:117] "RemoveContainer" containerID="ff4c0202065bb873ef8fb1228a50974593532c1b6948c3cf49560f16e388afbe"
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.333023 4817 scope.go:117] "RemoveContainer" containerID="df88e31d7bcbb8df54ab4a13a83a833c6e4b74bbe61c2f23d369db3a6e57f9a6"
Mar 20 12:30:30 crc kubenswrapper[4817]: E0320 12:30:30.333627 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df88e31d7bcbb8df54ab4a13a83a833c6e4b74bbe61c2f23d369db3a6e57f9a6\": container with ID starting with df88e31d7bcbb8df54ab4a13a83a833c6e4b74bbe61c2f23d369db3a6e57f9a6 not found: ID does not exist" containerID="df88e31d7bcbb8df54ab4a13a83a833c6e4b74bbe61c2f23d369db3a6e57f9a6"
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.333706 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df88e31d7bcbb8df54ab4a13a83a833c6e4b74bbe61c2f23d369db3a6e57f9a6"} err="failed to get container status \"df88e31d7bcbb8df54ab4a13a83a833c6e4b74bbe61c2f23d369db3a6e57f9a6\": rpc error: code = NotFound desc = could not find container \"df88e31d7bcbb8df54ab4a13a83a833c6e4b74bbe61c2f23d369db3a6e57f9a6\": container with ID starting with df88e31d7bcbb8df54ab4a13a83a833c6e4b74bbe61c2f23d369db3a6e57f9a6 not found: ID does not exist"
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.333749 4817 scope.go:117] "RemoveContainer" containerID="3a1284d1b74da260f96ec89f897371c7619879cf2fe1b0e9fa72345512f5fdc4"
Mar 20 12:30:30 crc kubenswrapper[4817]: E0320 12:30:30.334075 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a1284d1b74da260f96ec89f897371c7619879cf2fe1b0e9fa72345512f5fdc4\": container with ID starting with 3a1284d1b74da260f96ec89f897371c7619879cf2fe1b0e9fa72345512f5fdc4 not found: ID does not exist" containerID="3a1284d1b74da260f96ec89f897371c7619879cf2fe1b0e9fa72345512f5fdc4"
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.334104 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1284d1b74da260f96ec89f897371c7619879cf2fe1b0e9fa72345512f5fdc4"} err="failed to get container status \"3a1284d1b74da260f96ec89f897371c7619879cf2fe1b0e9fa72345512f5fdc4\": rpc error: code = NotFound desc = could not find container \"3a1284d1b74da260f96ec89f897371c7619879cf2fe1b0e9fa72345512f5fdc4\": container with ID starting with 3a1284d1b74da260f96ec89f897371c7619879cf2fe1b0e9fa72345512f5fdc4 not found: ID does not exist"
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.334142 4817 scope.go:117] "RemoveContainer" containerID="ff4c0202065bb873ef8fb1228a50974593532c1b6948c3cf49560f16e388afbe"
Mar 20 12:30:30 crc kubenswrapper[4817]: E0320 12:30:30.334361 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff4c0202065bb873ef8fb1228a50974593532c1b6948c3cf49560f16e388afbe\": container with ID starting with ff4c0202065bb873ef8fb1228a50974593532c1b6948c3cf49560f16e388afbe not found: ID does not exist" containerID="ff4c0202065bb873ef8fb1228a50974593532c1b6948c3cf49560f16e388afbe"
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.334383 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4c0202065bb873ef8fb1228a50974593532c1b6948c3cf49560f16e388afbe"} err="failed to get container status \"ff4c0202065bb873ef8fb1228a50974593532c1b6948c3cf49560f16e388afbe\": rpc error: code = NotFound desc = could not find container \"ff4c0202065bb873ef8fb1228a50974593532c1b6948c3cf49560f16e388afbe\": container with ID starting with ff4c0202065bb873ef8fb1228a50974593532c1b6948c3cf49560f16e388afbe not found: ID does not exist"
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.367498 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870bce80-0a37-4d5f-afbf-ffa85ea34a03-utilities\") pod \"870bce80-0a37-4d5f-afbf-ffa85ea34a03\" (UID: \"870bce80-0a37-4d5f-afbf-ffa85ea34a03\") "
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.367584 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x69fn\" (UniqueName: \"kubernetes.io/projected/870bce80-0a37-4d5f-afbf-ffa85ea34a03-kube-api-access-x69fn\") pod \"870bce80-0a37-4d5f-afbf-ffa85ea34a03\" (UID: \"870bce80-0a37-4d5f-afbf-ffa85ea34a03\") "
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.367673 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870bce80-0a37-4d5f-afbf-ffa85ea34a03-catalog-content\") pod \"870bce80-0a37-4d5f-afbf-ffa85ea34a03\" (UID: \"870bce80-0a37-4d5f-afbf-ffa85ea34a03\") "
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.369237 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/870bce80-0a37-4d5f-afbf-ffa85ea34a03-utilities" (OuterVolumeSpecName: "utilities") pod "870bce80-0a37-4d5f-afbf-ffa85ea34a03" (UID: "870bce80-0a37-4d5f-afbf-ffa85ea34a03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.387384 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/870bce80-0a37-4d5f-afbf-ffa85ea34a03-kube-api-access-x69fn" (OuterVolumeSpecName: "kube-api-access-x69fn") pod "870bce80-0a37-4d5f-afbf-ffa85ea34a03" (UID: "870bce80-0a37-4d5f-afbf-ffa85ea34a03"). InnerVolumeSpecName "kube-api-access-x69fn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.426271 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/870bce80-0a37-4d5f-afbf-ffa85ea34a03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "870bce80-0a37-4d5f-afbf-ffa85ea34a03" (UID: "870bce80-0a37-4d5f-afbf-ffa85ea34a03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.469189 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870bce80-0a37-4d5f-afbf-ffa85ea34a03-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.469231 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x69fn\" (UniqueName: \"kubernetes.io/projected/870bce80-0a37-4d5f-afbf-ffa85ea34a03-kube-api-access-x69fn\") on node \"crc\" DevicePath \"\""
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.469250 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870bce80-0a37-4d5f-afbf-ffa85ea34a03-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.582683 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fb87f"]
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.586268 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fb87f"]
Mar 20 12:30:30 crc kubenswrapper[4817]: I0320 12:30:30.669859 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="870bce80-0a37-4d5f-afbf-ffa85ea34a03" path="/var/lib/kubelet/pods/870bce80-0a37-4d5f-afbf-ffa85ea34a03/volumes"
Mar 20 12:30:31 crc kubenswrapper[4817]: I0320 12:30:31.710277 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v6wkw"]
Mar 20 12:30:32 crc kubenswrapper[4817]: I0320 12:30:32.301605 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5979884754-x9xxh"]
Mar 20 12:30:32 crc kubenswrapper[4817]: I0320 12:30:32.302368 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" podUID="4ca774b2-cbbd-4db9-82a9-820415902c3c" containerName="controller-manager" containerID="cri-o://da7efccbc71abef4be4287757dc2df3ca27b09c8601ce8a7ea1c610e2ee822de" gracePeriod=30
Mar 20 12:30:32 crc kubenswrapper[4817]: I0320 12:30:32.404436 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv"]
Mar 20 12:30:32 crc kubenswrapper[4817]: I0320 12:30:32.404680 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" podUID="519193a3-0c3a-4736-a169-233cc2d89766" containerName="route-controller-manager" containerID="cri-o://f13a95fbd986182e392491f10b939785a86d6112916ffeafa183c84cc7164d4a" gracePeriod=30
Mar 20 12:30:32 crc kubenswrapper[4817]: I0320 12:30:32.411851 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j47ss"]
Mar 20 12:30:32 crc kubenswrapper[4817]: I0320 12:30:32.412154 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j47ss" podUID="430036d2-3961-4715-b47b-c7d670e2ee26" containerName="registry-server" containerID="cri-o://8a01b3692710156b1f84225f87bf78ed9b128ef6738cff3d8998dd7f5ee93ce9" gracePeriod=2
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.282366 4817 generic.go:334] "Generic (PLEG): container finished" podID="519193a3-0c3a-4736-a169-233cc2d89766" containerID="f13a95fbd986182e392491f10b939785a86d6112916ffeafa183c84cc7164d4a" exitCode=0
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.282531 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" event={"ID":"519193a3-0c3a-4736-a169-233cc2d89766","Type":"ContainerDied","Data":"f13a95fbd986182e392491f10b939785a86d6112916ffeafa183c84cc7164d4a"}
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.284344 4817 generic.go:334] "Generic (PLEG): container finished" podID="430036d2-3961-4715-b47b-c7d670e2ee26" containerID="8a01b3692710156b1f84225f87bf78ed9b128ef6738cff3d8998dd7f5ee93ce9" exitCode=0
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.284382 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j47ss" event={"ID":"430036d2-3961-4715-b47b-c7d670e2ee26","Type":"ContainerDied","Data":"8a01b3692710156b1f84225f87bf78ed9b128ef6738cff3d8998dd7f5ee93ce9"}
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.285455 4817 generic.go:334] "Generic (PLEG): container finished" podID="4ca774b2-cbbd-4db9-82a9-820415902c3c" containerID="da7efccbc71abef4be4287757dc2df3ca27b09c8601ce8a7ea1c610e2ee822de" exitCode=0
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.285580 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" event={"ID":"4ca774b2-cbbd-4db9-82a9-820415902c3c","Type":"ContainerDied","Data":"da7efccbc71abef4be4287757dc2df3ca27b09c8601ce8a7ea1c610e2ee822de"}
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.513767 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.542633 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7"]
Mar 20 12:30:33 crc kubenswrapper[4817]: E0320 12:30:33.542837 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9" containerName="extract-content"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.542852 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9" containerName="extract-content"
Mar 20 12:30:33 crc kubenswrapper[4817]: E0320 12:30:33.542862 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd837d56-d9f1-4396-975c-5917f6f32bc9" containerName="extract-content"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.542868 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd837d56-d9f1-4396-975c-5917f6f32bc9" containerName="extract-content"
Mar 20 12:30:33 crc kubenswrapper[4817]: E0320 12:30:33.542876 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd837d56-d9f1-4396-975c-5917f6f32bc9" containerName="registry-server"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.542882 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd837d56-d9f1-4396-975c-5917f6f32bc9" containerName="registry-server"
Mar 20 12:30:33 crc kubenswrapper[4817]: E0320 12:30:33.542891 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9" containerName="extract-utilities"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.542897 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9" containerName="extract-utilities"
Mar 20 12:30:33 crc kubenswrapper[4817]: E0320 12:30:33.542908 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9" containerName="registry-server"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.542915 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9" containerName="registry-server"
Mar 20 12:30:33 crc kubenswrapper[4817]: E0320 12:30:33.542926 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd869c8-5c07-4b3d-b46b-11c21ddf7e06" containerName="pruner"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.542933 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd869c8-5c07-4b3d-b46b-11c21ddf7e06" containerName="pruner"
Mar 20 12:30:33 crc kubenswrapper[4817]: E0320 12:30:33.542945 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870bce80-0a37-4d5f-afbf-ffa85ea34a03" containerName="registry-server"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.542951 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="870bce80-0a37-4d5f-afbf-ffa85ea34a03" containerName="registry-server"
Mar 20 12:30:33 crc kubenswrapper[4817]: E0320 12:30:33.542957 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870bce80-0a37-4d5f-afbf-ffa85ea34a03" containerName="extract-content"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.542963 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="870bce80-0a37-4d5f-afbf-ffa85ea34a03" containerName="extract-content"
Mar 20 12:30:33 crc kubenswrapper[4817]: E0320 12:30:33.542975 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870bce80-0a37-4d5f-afbf-ffa85ea34a03" containerName="extract-utilities"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.542981 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="870bce80-0a37-4d5f-afbf-ffa85ea34a03" containerName="extract-utilities"
Mar 20 12:30:33 crc kubenswrapper[4817]: E0320 12:30:33.542990 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a41c54b1-d599-494b-8476-a2700bb5424f" containerName="oc"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.542995 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a41c54b1-d599-494b-8476-a2700bb5424f" containerName="oc"
Mar 20 12:30:33 crc kubenswrapper[4817]: E0320 12:30:33.543003 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd837d56-d9f1-4396-975c-5917f6f32bc9" containerName="extract-utilities"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.543009 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd837d56-d9f1-4396-975c-5917f6f32bc9" containerName="extract-utilities"
Mar 20 12:30:33 crc kubenswrapper[4817]: E0320 12:30:33.543019 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6302c9-3377-4a44-bf25-6acf56bb86d9" containerName="collect-profiles"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.543024 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6302c9-3377-4a44-bf25-6acf56bb86d9" containerName="collect-profiles"
Mar 20 12:30:33 crc kubenswrapper[4817]: E0320 12:30:33.543046 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="519193a3-0c3a-4736-a169-233cc2d89766" containerName="route-controller-manager"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.543052 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="519193a3-0c3a-4736-a169-233cc2d89766" containerName="route-controller-manager"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.543148 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa7f39b-32d2-4f33-84c2-7cbe67aaf8a9" containerName="registry-server"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.543158 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd869c8-5c07-4b3d-b46b-11c21ddf7e06" containerName="pruner"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.543169 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="870bce80-0a37-4d5f-afbf-ffa85ea34a03" containerName="registry-server"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.543177 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd837d56-d9f1-4396-975c-5917f6f32bc9" containerName="registry-server"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.543185 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf6302c9-3377-4a44-bf25-6acf56bb86d9" containerName="collect-profiles"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.543193 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="519193a3-0c3a-4736-a169-233cc2d89766" containerName="route-controller-manager"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.543202 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="a41c54b1-d599-494b-8476-a2700bb5424f" containerName="oc"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.543566 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.557596 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j47ss"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.567487 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7"]
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.612608 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbdnx\" (UniqueName: \"kubernetes.io/projected/519193a3-0c3a-4736-a169-233cc2d89766-kube-api-access-sbdnx\") pod \"519193a3-0c3a-4736-a169-233cc2d89766\" (UID: \"519193a3-0c3a-4736-a169-233cc2d89766\") "
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.612793 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/519193a3-0c3a-4736-a169-233cc2d89766-client-ca\") pod \"519193a3-0c3a-4736-a169-233cc2d89766\" (UID: \"519193a3-0c3a-4736-a169-233cc2d89766\") "
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.612866 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/519193a3-0c3a-4736-a169-233cc2d89766-config\") pod \"519193a3-0c3a-4736-a169-233cc2d89766\" (UID: \"519193a3-0c3a-4736-a169-233cc2d89766\") "
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.612913 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/519193a3-0c3a-4736-a169-233cc2d89766-serving-cert\") pod \"519193a3-0c3a-4736-a169-233cc2d89766\" (UID: \"519193a3-0c3a-4736-a169-233cc2d89766\") "
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.614024 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/519193a3-0c3a-4736-a169-233cc2d89766-client-ca" (OuterVolumeSpecName: "client-ca") pod "519193a3-0c3a-4736-a169-233cc2d89766" (UID: "519193a3-0c3a-4736-a169-233cc2d89766"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.615027 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/519193a3-0c3a-4736-a169-233cc2d89766-config" (OuterVolumeSpecName: "config") pod "519193a3-0c3a-4736-a169-233cc2d89766" (UID: "519193a3-0c3a-4736-a169-233cc2d89766"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.615515 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/519193a3-0c3a-4736-a169-233cc2d89766-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.619609 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/519193a3-0c3a-4736-a169-233cc2d89766-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "519193a3-0c3a-4736-a169-233cc2d89766" (UID: "519193a3-0c3a-4736-a169-233cc2d89766"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.631327 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/519193a3-0c3a-4736-a169-233cc2d89766-kube-api-access-sbdnx" (OuterVolumeSpecName: "kube-api-access-sbdnx") pod "519193a3-0c3a-4736-a169-233cc2d89766" (UID: "519193a3-0c3a-4736-a169-233cc2d89766"). InnerVolumeSpecName "kube-api-access-sbdnx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.717596 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/430036d2-3961-4715-b47b-c7d670e2ee26-catalog-content\") pod \"430036d2-3961-4715-b47b-c7d670e2ee26\" (UID: \"430036d2-3961-4715-b47b-c7d670e2ee26\") "
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.717657 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm6cp\" (UniqueName: \"kubernetes.io/projected/430036d2-3961-4715-b47b-c7d670e2ee26-kube-api-access-mm6cp\") pod \"430036d2-3961-4715-b47b-c7d670e2ee26\" (UID: \"430036d2-3961-4715-b47b-c7d670e2ee26\") "
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.717696 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/430036d2-3961-4715-b47b-c7d670e2ee26-utilities\") pod \"430036d2-3961-4715-b47b-c7d670e2ee26\" (UID: \"430036d2-3961-4715-b47b-c7d670e2ee26\") "
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.718056 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-config\") pod \"route-controller-manager-659bc48566-4mkc7\" (UID: \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\") " pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.718096 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvxt2\" (UniqueName: \"kubernetes.io/projected/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-kube-api-access-dvxt2\") pod \"route-controller-manager-659bc48566-4mkc7\" (UID: \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\") " pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.718147 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-serving-cert\") pod \"route-controller-manager-659bc48566-4mkc7\" (UID: \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\") " pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.718186 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-client-ca\") pod \"route-controller-manager-659bc48566-4mkc7\" (UID: \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\") " pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.718246 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbdnx\" (UniqueName: \"kubernetes.io/projected/519193a3-0c3a-4736-a169-233cc2d89766-kube-api-access-sbdnx\") on node \"crc\" DevicePath \"\""
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.718260 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/519193a3-0c3a-4736-a169-233cc2d89766-config\") on node \"crc\" DevicePath \"\""
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.718272 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/519193a3-0c3a-4736-a169-233cc2d89766-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.721293 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/430036d2-3961-4715-b47b-c7d670e2ee26-utilities" (OuterVolumeSpecName: "utilities") pod "430036d2-3961-4715-b47b-c7d670e2ee26" (UID: "430036d2-3961-4715-b47b-c7d670e2ee26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.722306 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/430036d2-3961-4715-b47b-c7d670e2ee26-kube-api-access-mm6cp" (OuterVolumeSpecName: "kube-api-access-mm6cp") pod "430036d2-3961-4715-b47b-c7d670e2ee26" (UID: "430036d2-3961-4715-b47b-c7d670e2ee26"). InnerVolumeSpecName "kube-api-access-mm6cp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.822429 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-config\") pod \"route-controller-manager-659bc48566-4mkc7\" (UID: \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\") " pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.822485 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvxt2\" (UniqueName: \"kubernetes.io/projected/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-kube-api-access-dvxt2\") pod \"route-controller-manager-659bc48566-4mkc7\" (UID: \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\") " pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7"
Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.822508 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-serving-cert\") pod \"route-controller-manager-659bc48566-4mkc7\" (UID: \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\") " pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7"
Mar 20 
12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.822565 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-client-ca\") pod \"route-controller-manager-659bc48566-4mkc7\" (UID: \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\") " pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7" Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.822824 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm6cp\" (UniqueName: \"kubernetes.io/projected/430036d2-3961-4715-b47b-c7d670e2ee26-kube-api-access-mm6cp\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.822840 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/430036d2-3961-4715-b47b-c7d670e2ee26-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.827579 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-config\") pod \"route-controller-manager-659bc48566-4mkc7\" (UID: \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\") " pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7" Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.827923 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-client-ca\") pod \"route-controller-manager-659bc48566-4mkc7\" (UID: \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\") " pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7" Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.832202 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-serving-cert\") pod \"route-controller-manager-659bc48566-4mkc7\" (UID: \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\") " pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7" Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.842168 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvxt2\" (UniqueName: \"kubernetes.io/projected/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-kube-api-access-dvxt2\") pod \"route-controller-manager-659bc48566-4mkc7\" (UID: \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\") " pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7" Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.865744 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/430036d2-3961-4715-b47b-c7d670e2ee26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "430036d2-3961-4715-b47b-c7d670e2ee26" (UID: "430036d2-3961-4715-b47b-c7d670e2ee26"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.871877 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7" Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.923848 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/430036d2-3961-4715-b47b-c7d670e2ee26-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:33 crc kubenswrapper[4817]: I0320 12:30:33.966972 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.126166 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmphp\" (UniqueName: \"kubernetes.io/projected/4ca774b2-cbbd-4db9-82a9-820415902c3c-kube-api-access-qmphp\") pod \"4ca774b2-cbbd-4db9-82a9-820415902c3c\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.126627 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-proxy-ca-bundles\") pod \"4ca774b2-cbbd-4db9-82a9-820415902c3c\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.126720 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-client-ca\") pod \"4ca774b2-cbbd-4db9-82a9-820415902c3c\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.126779 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-config\") pod \"4ca774b2-cbbd-4db9-82a9-820415902c3c\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.126806 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ca774b2-cbbd-4db9-82a9-820415902c3c-serving-cert\") pod \"4ca774b2-cbbd-4db9-82a9-820415902c3c\" (UID: \"4ca774b2-cbbd-4db9-82a9-820415902c3c\") " Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.128769 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4ca774b2-cbbd-4db9-82a9-820415902c3c" (UID: "4ca774b2-cbbd-4db9-82a9-820415902c3c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.128852 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-client-ca" (OuterVolumeSpecName: "client-ca") pod "4ca774b2-cbbd-4db9-82a9-820415902c3c" (UID: "4ca774b2-cbbd-4db9-82a9-820415902c3c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.129147 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-config" (OuterVolumeSpecName: "config") pod "4ca774b2-cbbd-4db9-82a9-820415902c3c" (UID: "4ca774b2-cbbd-4db9-82a9-820415902c3c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.143805 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ca774b2-cbbd-4db9-82a9-820415902c3c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4ca774b2-cbbd-4db9-82a9-820415902c3c" (UID: "4ca774b2-cbbd-4db9-82a9-820415902c3c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.143880 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ca774b2-cbbd-4db9-82a9-820415902c3c-kube-api-access-qmphp" (OuterVolumeSpecName: "kube-api-access-qmphp") pod "4ca774b2-cbbd-4db9-82a9-820415902c3c" (UID: "4ca774b2-cbbd-4db9-82a9-820415902c3c"). InnerVolumeSpecName "kube-api-access-qmphp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.228351 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.228389 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.228400 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ca774b2-cbbd-4db9-82a9-820415902c3c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.228414 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmphp\" (UniqueName: \"kubernetes.io/projected/4ca774b2-cbbd-4db9-82a9-820415902c3c-kube-api-access-qmphp\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.228428 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ca774b2-cbbd-4db9-82a9-820415902c3c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.291996 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.294971 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv" event={"ID":"519193a3-0c3a-4736-a169-233cc2d89766","Type":"ContainerDied","Data":"632822be3abb0f66f93c309bc7d1fc01186230fd86969c581ad0ab09d3fa499d"} Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.295022 4817 scope.go:117] "RemoveContainer" containerID="f13a95fbd986182e392491f10b939785a86d6112916ffeafa183c84cc7164d4a" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.298055 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j47ss" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.298059 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j47ss" event={"ID":"430036d2-3961-4715-b47b-c7d670e2ee26","Type":"ContainerDied","Data":"0bbc360d2aebf4e2f930d24633eda829b9f63a75ae6148054ef6f3f782ebde93"} Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.304717 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" event={"ID":"4ca774b2-cbbd-4db9-82a9-820415902c3c","Type":"ContainerDied","Data":"78ac0dd30d2c99ec53ceb6f9453cf39bf8765a392e8455eeeeba8fb738869ebc"} Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.304857 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5979884754-x9xxh" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.319895 4817 scope.go:117] "RemoveContainer" containerID="8a01b3692710156b1f84225f87bf78ed9b128ef6738cff3d8998dd7f5ee93ce9" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.331994 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv"] Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.340093 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86956f7bdb-pxfnv"] Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.341148 4817 scope.go:117] "RemoveContainer" containerID="df766725add0660051b212b51484cfb302e029af7d1677e5c4896c51456e3128" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.342613 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j47ss"] Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.346045 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j47ss"] Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.361243 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5979884754-x9xxh"] Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.362453 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5979884754-x9xxh"] Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.369512 4817 scope.go:117] "RemoveContainer" containerID="2bd411f8722e0bdb34167d5b6d5779b54a75851d87cb8671f4857a18f5d4ea74" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.384359 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7"] Mar 20 12:30:34 crc kubenswrapper[4817]: 
I0320 12:30:34.392261 4817 scope.go:117] "RemoveContainer" containerID="da7efccbc71abef4be4287757dc2df3ca27b09c8601ce8a7ea1c610e2ee822de" Mar 20 12:30:34 crc kubenswrapper[4817]: W0320 12:30:34.396903 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f6a3ae8_1cee_4790_8fa4_37a7a613ed97.slice/crio-587bd707e75526960ae5ed29817a23ab9805e3b89fc29ebbe3a8316fa4d2e804 WatchSource:0}: Error finding container 587bd707e75526960ae5ed29817a23ab9805e3b89fc29ebbe3a8316fa4d2e804: Status 404 returned error can't find the container with id 587bd707e75526960ae5ed29817a23ab9805e3b89fc29ebbe3a8316fa4d2e804 Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.671806 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="430036d2-3961-4715-b47b-c7d670e2ee26" path="/var/lib/kubelet/pods/430036d2-3961-4715-b47b-c7d670e2ee26/volumes" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.672664 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ca774b2-cbbd-4db9-82a9-820415902c3c" path="/var/lib/kubelet/pods/4ca774b2-cbbd-4db9-82a9-820415902c3c/volumes" Mar 20 12:30:34 crc kubenswrapper[4817]: I0320 12:30:34.673470 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="519193a3-0c3a-4736-a169-233cc2d89766" path="/var/lib/kubelet/pods/519193a3-0c3a-4736-a169-233cc2d89766/volumes" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.323012 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7" event={"ID":"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97","Type":"ContainerStarted","Data":"d21617e509c20a59e0330791ce1d106edfa21ed325153fdc8fbf6fea1501b338"} Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.323350 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7" 
event={"ID":"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97","Type":"ContainerStarted","Data":"587bd707e75526960ae5ed29817a23ab9805e3b89fc29ebbe3a8316fa4d2e804"} Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.323899 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.328977 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.347969 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7" podStartSLOduration=3.347947779 podStartE2EDuration="3.347947779s" podCreationTimestamp="2026-03-20 12:30:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:30:35.34198175 +0000 UTC m=+197.430294533" watchObservedRunningTime="2026-03-20 12:30:35.347947779 +0000 UTC m=+197.436260562" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.883504 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d7d994-pddlv"] Mar 20 12:30:35 crc kubenswrapper[4817]: E0320 12:30:35.883702 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="430036d2-3961-4715-b47b-c7d670e2ee26" containerName="extract-utilities" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.883715 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="430036d2-3961-4715-b47b-c7d670e2ee26" containerName="extract-utilities" Mar 20 12:30:35 crc kubenswrapper[4817]: E0320 12:30:35.883725 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="430036d2-3961-4715-b47b-c7d670e2ee26" containerName="extract-content" Mar 20 12:30:35 
crc kubenswrapper[4817]: I0320 12:30:35.883732 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="430036d2-3961-4715-b47b-c7d670e2ee26" containerName="extract-content" Mar 20 12:30:35 crc kubenswrapper[4817]: E0320 12:30:35.883742 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="430036d2-3961-4715-b47b-c7d670e2ee26" containerName="registry-server" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.883749 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="430036d2-3961-4715-b47b-c7d670e2ee26" containerName="registry-server" Mar 20 12:30:35 crc kubenswrapper[4817]: E0320 12:30:35.883761 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca774b2-cbbd-4db9-82a9-820415902c3c" containerName="controller-manager" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.883767 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca774b2-cbbd-4db9-82a9-820415902c3c" containerName="controller-manager" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.883873 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ca774b2-cbbd-4db9-82a9-820415902c3c" containerName="controller-manager" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.883889 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="430036d2-3961-4715-b47b-c7d670e2ee26" containerName="registry-server" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.884289 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.887491 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.888271 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.888511 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.890035 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.890109 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.891753 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.899156 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 12:30:35 crc kubenswrapper[4817]: I0320 12:30:35.903367 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d7d994-pddlv"] Mar 20 12:30:36 crc kubenswrapper[4817]: I0320 12:30:36.050259 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-config\") pod \"controller-manager-5d7d994-pddlv\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 
20 12:30:36 crc kubenswrapper[4817]: I0320 12:30:36.050331 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-proxy-ca-bundles\") pod \"controller-manager-5d7d994-pddlv\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:36 crc kubenswrapper[4817]: I0320 12:30:36.050358 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-client-ca\") pod \"controller-manager-5d7d994-pddlv\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:36 crc kubenswrapper[4817]: I0320 12:30:36.050607 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfhqq\" (UniqueName: \"kubernetes.io/projected/2f22ca62-a967-4dba-a7bd-27d16d4f2214-kube-api-access-hfhqq\") pod \"controller-manager-5d7d994-pddlv\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:36 crc kubenswrapper[4817]: I0320 12:30:36.050677 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f22ca62-a967-4dba-a7bd-27d16d4f2214-serving-cert\") pod \"controller-manager-5d7d994-pddlv\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:36 crc kubenswrapper[4817]: I0320 12:30:36.152383 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfhqq\" (UniqueName: 
\"kubernetes.io/projected/2f22ca62-a967-4dba-a7bd-27d16d4f2214-kube-api-access-hfhqq\") pod \"controller-manager-5d7d994-pddlv\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:36 crc kubenswrapper[4817]: I0320 12:30:36.152833 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f22ca62-a967-4dba-a7bd-27d16d4f2214-serving-cert\") pod \"controller-manager-5d7d994-pddlv\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:36 crc kubenswrapper[4817]: I0320 12:30:36.153720 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-config\") pod \"controller-manager-5d7d994-pddlv\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:36 crc kubenswrapper[4817]: I0320 12:30:36.153794 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-proxy-ca-bundles\") pod \"controller-manager-5d7d994-pddlv\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:36 crc kubenswrapper[4817]: I0320 12:30:36.153820 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-client-ca\") pod \"controller-manager-5d7d994-pddlv\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:36 crc kubenswrapper[4817]: I0320 12:30:36.154556 4817 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-config\") pod \"controller-manager-5d7d994-pddlv\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:36 crc kubenswrapper[4817]: I0320 12:30:36.154647 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-client-ca\") pod \"controller-manager-5d7d994-pddlv\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:36 crc kubenswrapper[4817]: I0320 12:30:36.155002 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-proxy-ca-bundles\") pod \"controller-manager-5d7d994-pddlv\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:36 crc kubenswrapper[4817]: I0320 12:30:36.182754 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f22ca62-a967-4dba-a7bd-27d16d4f2214-serving-cert\") pod \"controller-manager-5d7d994-pddlv\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:36 crc kubenswrapper[4817]: I0320 12:30:36.188903 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfhqq\" (UniqueName: \"kubernetes.io/projected/2f22ca62-a967-4dba-a7bd-27d16d4f2214-kube-api-access-hfhqq\") pod \"controller-manager-5d7d994-pddlv\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:36 crc kubenswrapper[4817]: I0320 12:30:36.200820 4817 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:36 crc kubenswrapper[4817]: I0320 12:30:36.626316 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d7d994-pddlv"] Mar 20 12:30:36 crc kubenswrapper[4817]: W0320 12:30:36.640292 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f22ca62_a967_4dba_a7bd_27d16d4f2214.slice/crio-ab7832b4bf62e88ffcead90506627d73fb16fae9abbe724de41b650b6385f3dc WatchSource:0}: Error finding container ab7832b4bf62e88ffcead90506627d73fb16fae9abbe724de41b650b6385f3dc: Status 404 returned error can't find the container with id ab7832b4bf62e88ffcead90506627d73fb16fae9abbe724de41b650b6385f3dc Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.124475 4817 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.125692 4817 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.125863 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.125996 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://25010709ae6e010f514efd55c5e0806531d54505cb61b091583b6b1574087ff7" gracePeriod=15 Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.126045 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://592399f70fccfe98a038e48372cb368985661c030b8b06eb2163c97e2b88be6c" gracePeriod=15 Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.126050 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://8c0332d0f40a5f98e36bfe09d58a8f6982229f6199e3ae5f4f2fa1fb6e8cace3" gracePeriod=15 Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.126160 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://09c8216d91ba3dba33a3aff1c79ce42139611b05a5f993d10729fde00745841b" gracePeriod=15 Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.126497 4817 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 12:30:37 crc kubenswrapper[4817]: E0320 12:30:37.126663 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 12:30:37 crc 
kubenswrapper[4817]: I0320 12:30:37.126681 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 12:30:37 crc kubenswrapper[4817]: E0320 12:30:37.126693 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.126700 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 12:30:37 crc kubenswrapper[4817]: E0320 12:30:37.126714 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.126722 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 12:30:37 crc kubenswrapper[4817]: E0320 12:30:37.126729 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.126737 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 12:30:37 crc kubenswrapper[4817]: E0320 12:30:37.126746 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.126753 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 12:30:37 crc kubenswrapper[4817]: E0320 12:30:37.126763 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.126782 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 12:30:37 crc kubenswrapper[4817]: E0320 12:30:37.126796 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.126803 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 12:30:37 crc kubenswrapper[4817]: E0320 12:30:37.126814 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.126822 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 12:30:37 crc kubenswrapper[4817]: E0320 12:30:37.126831 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.126839 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.126960 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.126971 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 
12:30:37.126986 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.126995 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.127003 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.127014 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.127023 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.127036 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 12:30:37 crc kubenswrapper[4817]: E0320 12:30:37.127174 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.127185 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.127323 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.128783 4817 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://36002b64644d46d0a1aba799f41e35a249b2e5d763ccf6d253730f4f11860eae" gracePeriod=15 Mar 20 12:30:37 crc kubenswrapper[4817]: E0320 12:30:37.192200 4817 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.22:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.270527 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.270576 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.270593 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.270612 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.270629 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.270842 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.270896 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.270949 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.341731 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.343012 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.343668 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="592399f70fccfe98a038e48372cb368985661c030b8b06eb2163c97e2b88be6c" exitCode=0 Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.343692 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8c0332d0f40a5f98e36bfe09d58a8f6982229f6199e3ae5f4f2fa1fb6e8cace3" exitCode=0 Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.343702 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="09c8216d91ba3dba33a3aff1c79ce42139611b05a5f993d10729fde00745841b" exitCode=0 Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.343712 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="36002b64644d46d0a1aba799f41e35a249b2e5d763ccf6d253730f4f11860eae" exitCode=2 Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.343768 4817 scope.go:117] "RemoveContainer" containerID="3e572dcd05d0127d887dd34d43b57ba70ca318e81366eeeed89df28e8cd63823" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.357382 4817 generic.go:334] "Generic (PLEG): container finished" podID="e36838e6-8822-4760-a052-b888667f5a14" containerID="6fdeb4ea13193a05df960c3380d31afda71a7344c20f3aeabd012396f91e8414" exitCode=0 Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.357421 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"e36838e6-8822-4760-a052-b888667f5a14","Type":"ContainerDied","Data":"6fdeb4ea13193a05df960c3380d31afda71a7344c20f3aeabd012396f91e8414"} Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.358164 4817 status_manager.go:851] "Failed to get status for pod" podUID="e36838e6-8822-4760-a052-b888667f5a14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.358595 4817 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.359577 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" event={"ID":"2f22ca62-a967-4dba-a7bd-27d16d4f2214","Type":"ContainerStarted","Data":"0aeb7c3d051167825d697505e9fc81e500b4ed681d17de77b5f1a8c323716db5"} Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.359616 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" event={"ID":"2f22ca62-a967-4dba-a7bd-27d16d4f2214","Type":"ContainerStarted","Data":"ab7832b4bf62e88ffcead90506627d73fb16fae9abbe724de41b650b6385f3dc"} Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.359881 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.360065 4817 status_manager.go:851] "Failed to get status for pod" podUID="2f22ca62-a967-4dba-a7bd-27d16d4f2214" 
pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5d7d994-pddlv\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.360359 4817 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.360743 4817 status_manager.go:851] "Failed to get status for pod" podUID="e36838e6-8822-4760-a052-b888667f5a14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.366302 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.366576 4817 status_manager.go:851] "Failed to get status for pod" podUID="2f22ca62-a967-4dba-a7bd-27d16d4f2214" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5d7d994-pddlv\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.366716 4817 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.22:6443: connect: connection refused" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.366850 4817 status_manager.go:851] "Failed to get status for pod" podUID="e36838e6-8822-4760-a052-b888667f5a14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.372479 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.372510 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.372534 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.372568 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:37 crc 
kubenswrapper[4817]: I0320 12:30:37.372585 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.372600 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.372616 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.372633 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.372686 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc 
kubenswrapper[4817]: I0320 12:30:37.372715 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.372737 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.372757 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.372778 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.372802 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.372831 4817 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.372850 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: I0320 12:30:37.493671 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:37 crc kubenswrapper[4817]: W0320 12:30:37.509183 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-e4f496c82746ff075fbc3e6051a2514729b5512540bb6f30459de5f37b655277 WatchSource:0}: Error finding container e4f496c82746ff075fbc3e6051a2514729b5512540bb6f30459de5f37b655277: Status 404 returned error can't find the container with id e4f496c82746ff075fbc3e6051a2514729b5512540bb6f30459de5f37b655277 Mar 20 12:30:37 crc kubenswrapper[4817]: E0320 12:30:37.511664 4817 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e8c8c50e59bfe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:30:37.51108915 +0000 UTC m=+199.599401933,LastTimestamp:2026-03-20 12:30:37.51108915 +0000 UTC m=+199.599401933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.369647 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"0bb2f4956b868db9d5665765dee47d0593ea68dc4cdc402e30a9a84a68409a6d"} Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.369723 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e4f496c82746ff075fbc3e6051a2514729b5512540bb6f30459de5f37b655277"} Mar 20 12:30:38 crc kubenswrapper[4817]: E0320 12:30:38.370714 4817 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.22:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.370704 4817 status_manager.go:851] "Failed to get status for pod" podUID="e36838e6-8822-4760-a052-b888667f5a14" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.371100 4817 status_manager.go:851] "Failed to get status for pod" podUID="2f22ca62-a967-4dba-a7bd-27d16d4f2214" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5d7d994-pddlv\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.371759 4817 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.374381 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.665974 4817 status_manager.go:851] "Failed to get status for pod" podUID="2f22ca62-a967-4dba-a7bd-27d16d4f2214" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5d7d994-pddlv\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.666658 4817 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: 
connection refused" Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.667087 4817 status_manager.go:851] "Failed to get status for pod" podUID="e36838e6-8822-4760-a052-b888667f5a14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.728385 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.729788 4817 status_manager.go:851] "Failed to get status for pod" podUID="2f22ca62-a967-4dba-a7bd-27d16d4f2214" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5d7d994-pddlv\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.730329 4817 status_manager.go:851] "Failed to get status for pod" podUID="e36838e6-8822-4760-a052-b888667f5a14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.894876 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e36838e6-8822-4760-a052-b888667f5a14-kube-api-access\") pod \"e36838e6-8822-4760-a052-b888667f5a14\" (UID: \"e36838e6-8822-4760-a052-b888667f5a14\") " Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.895205 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/e36838e6-8822-4760-a052-b888667f5a14-kubelet-dir\") pod \"e36838e6-8822-4760-a052-b888667f5a14\" (UID: \"e36838e6-8822-4760-a052-b888667f5a14\") " Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.895258 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e36838e6-8822-4760-a052-b888667f5a14-var-lock\") pod \"e36838e6-8822-4760-a052-b888667f5a14\" (UID: \"e36838e6-8822-4760-a052-b888667f5a14\") " Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.895710 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e36838e6-8822-4760-a052-b888667f5a14-var-lock" (OuterVolumeSpecName: "var-lock") pod "e36838e6-8822-4760-a052-b888667f5a14" (UID: "e36838e6-8822-4760-a052-b888667f5a14"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.896267 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e36838e6-8822-4760-a052-b888667f5a14-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e36838e6-8822-4760-a052-b888667f5a14" (UID: "e36838e6-8822-4760-a052-b888667f5a14"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.905179 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e36838e6-8822-4760-a052-b888667f5a14-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e36838e6-8822-4760-a052-b888667f5a14" (UID: "e36838e6-8822-4760-a052-b888667f5a14"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.996615 4817 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e36838e6-8822-4760-a052-b888667f5a14-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.996651 4817 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e36838e6-8822-4760-a052-b888667f5a14-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:38 crc kubenswrapper[4817]: I0320 12:30:38.996660 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e36838e6-8822-4760-a052-b888667f5a14-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.383842 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e36838e6-8822-4760-a052-b888667f5a14","Type":"ContainerDied","Data":"81d779dddb471c2dd437bbfbbbc642362d9a0dc546d028e4d8366ace2aab8fb6"} Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.384308 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81d779dddb471c2dd437bbfbbbc642362d9a0dc546d028e4d8366ace2aab8fb6" Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.383862 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.494372 4817 status_manager.go:851] "Failed to get status for pod" podUID="2f22ca62-a967-4dba-a7bd-27d16d4f2214" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5d7d994-pddlv\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.494560 4817 status_manager.go:851] "Failed to get status for pod" podUID="e36838e6-8822-4760-a052-b888667f5a14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.499683 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.500370 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.500690 4817 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.500990 4817 status_manager.go:851] "Failed to get status for pod" podUID="e36838e6-8822-4760-a052-b888667f5a14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.501395 4817 status_manager.go:851] "Failed to get status for pod" podUID="2f22ca62-a967-4dba-a7bd-27d16d4f2214" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5d7d994-pddlv\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.645099 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.645235 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 
12:30:39.645249 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.645338 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.645408 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.645920 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.646472 4817 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.646619 4817 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:39 crc kubenswrapper[4817]: I0320 12:30:39.748172 4817 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.394895 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.395691 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="25010709ae6e010f514efd55c5e0806531d54505cb61b091583b6b1574087ff7" exitCode=0 Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.395758 4817 scope.go:117] "RemoveContainer" containerID="592399f70fccfe98a038e48372cb368985661c030b8b06eb2163c97e2b88be6c" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.395839 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.415375 4817 status_manager.go:851] "Failed to get status for pod" podUID="2f22ca62-a967-4dba-a7bd-27d16d4f2214" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5d7d994-pddlv\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.415731 4817 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.416462 4817 status_manager.go:851] "Failed to get status for pod" podUID="e36838e6-8822-4760-a052-b888667f5a14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.426913 4817 scope.go:117] "RemoveContainer" containerID="8c0332d0f40a5f98e36bfe09d58a8f6982229f6199e3ae5f4f2fa1fb6e8cace3" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.446167 4817 scope.go:117] "RemoveContainer" containerID="09c8216d91ba3dba33a3aff1c79ce42139611b05a5f993d10729fde00745841b" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.466272 4817 scope.go:117] "RemoveContainer" containerID="36002b64644d46d0a1aba799f41e35a249b2e5d763ccf6d253730f4f11860eae" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.481551 4817 scope.go:117] "RemoveContainer" containerID="25010709ae6e010f514efd55c5e0806531d54505cb61b091583b6b1574087ff7" Mar 20 12:30:40 
crc kubenswrapper[4817]: I0320 12:30:40.500901 4817 scope.go:117] "RemoveContainer" containerID="9038c92a5069bb2d6a27e3816489724a8a479f0f8e98d8dd474f132ec2e9f759" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.538286 4817 scope.go:117] "RemoveContainer" containerID="592399f70fccfe98a038e48372cb368985661c030b8b06eb2163c97e2b88be6c" Mar 20 12:30:40 crc kubenswrapper[4817]: E0320 12:30:40.539303 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"592399f70fccfe98a038e48372cb368985661c030b8b06eb2163c97e2b88be6c\": container with ID starting with 592399f70fccfe98a038e48372cb368985661c030b8b06eb2163c97e2b88be6c not found: ID does not exist" containerID="592399f70fccfe98a038e48372cb368985661c030b8b06eb2163c97e2b88be6c" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.539382 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"592399f70fccfe98a038e48372cb368985661c030b8b06eb2163c97e2b88be6c"} err="failed to get container status \"592399f70fccfe98a038e48372cb368985661c030b8b06eb2163c97e2b88be6c\": rpc error: code = NotFound desc = could not find container \"592399f70fccfe98a038e48372cb368985661c030b8b06eb2163c97e2b88be6c\": container with ID starting with 592399f70fccfe98a038e48372cb368985661c030b8b06eb2163c97e2b88be6c not found: ID does not exist" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.539408 4817 scope.go:117] "RemoveContainer" containerID="8c0332d0f40a5f98e36bfe09d58a8f6982229f6199e3ae5f4f2fa1fb6e8cace3" Mar 20 12:30:40 crc kubenswrapper[4817]: E0320 12:30:40.539719 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c0332d0f40a5f98e36bfe09d58a8f6982229f6199e3ae5f4f2fa1fb6e8cace3\": container with ID starting with 8c0332d0f40a5f98e36bfe09d58a8f6982229f6199e3ae5f4f2fa1fb6e8cace3 not found: ID does not exist" 
containerID="8c0332d0f40a5f98e36bfe09d58a8f6982229f6199e3ae5f4f2fa1fb6e8cace3" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.539817 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c0332d0f40a5f98e36bfe09d58a8f6982229f6199e3ae5f4f2fa1fb6e8cace3"} err="failed to get container status \"8c0332d0f40a5f98e36bfe09d58a8f6982229f6199e3ae5f4f2fa1fb6e8cace3\": rpc error: code = NotFound desc = could not find container \"8c0332d0f40a5f98e36bfe09d58a8f6982229f6199e3ae5f4f2fa1fb6e8cace3\": container with ID starting with 8c0332d0f40a5f98e36bfe09d58a8f6982229f6199e3ae5f4f2fa1fb6e8cace3 not found: ID does not exist" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.540041 4817 scope.go:117] "RemoveContainer" containerID="09c8216d91ba3dba33a3aff1c79ce42139611b05a5f993d10729fde00745841b" Mar 20 12:30:40 crc kubenswrapper[4817]: E0320 12:30:40.540534 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09c8216d91ba3dba33a3aff1c79ce42139611b05a5f993d10729fde00745841b\": container with ID starting with 09c8216d91ba3dba33a3aff1c79ce42139611b05a5f993d10729fde00745841b not found: ID does not exist" containerID="09c8216d91ba3dba33a3aff1c79ce42139611b05a5f993d10729fde00745841b" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.540591 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c8216d91ba3dba33a3aff1c79ce42139611b05a5f993d10729fde00745841b"} err="failed to get container status \"09c8216d91ba3dba33a3aff1c79ce42139611b05a5f993d10729fde00745841b\": rpc error: code = NotFound desc = could not find container \"09c8216d91ba3dba33a3aff1c79ce42139611b05a5f993d10729fde00745841b\": container with ID starting with 09c8216d91ba3dba33a3aff1c79ce42139611b05a5f993d10729fde00745841b not found: ID does not exist" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.540636 4817 scope.go:117] 
"RemoveContainer" containerID="36002b64644d46d0a1aba799f41e35a249b2e5d763ccf6d253730f4f11860eae" Mar 20 12:30:40 crc kubenswrapper[4817]: E0320 12:30:40.540972 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36002b64644d46d0a1aba799f41e35a249b2e5d763ccf6d253730f4f11860eae\": container with ID starting with 36002b64644d46d0a1aba799f41e35a249b2e5d763ccf6d253730f4f11860eae not found: ID does not exist" containerID="36002b64644d46d0a1aba799f41e35a249b2e5d763ccf6d253730f4f11860eae" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.541057 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36002b64644d46d0a1aba799f41e35a249b2e5d763ccf6d253730f4f11860eae"} err="failed to get container status \"36002b64644d46d0a1aba799f41e35a249b2e5d763ccf6d253730f4f11860eae\": rpc error: code = NotFound desc = could not find container \"36002b64644d46d0a1aba799f41e35a249b2e5d763ccf6d253730f4f11860eae\": container with ID starting with 36002b64644d46d0a1aba799f41e35a249b2e5d763ccf6d253730f4f11860eae not found: ID does not exist" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.541148 4817 scope.go:117] "RemoveContainer" containerID="25010709ae6e010f514efd55c5e0806531d54505cb61b091583b6b1574087ff7" Mar 20 12:30:40 crc kubenswrapper[4817]: E0320 12:30:40.541669 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25010709ae6e010f514efd55c5e0806531d54505cb61b091583b6b1574087ff7\": container with ID starting with 25010709ae6e010f514efd55c5e0806531d54505cb61b091583b6b1574087ff7 not found: ID does not exist" containerID="25010709ae6e010f514efd55c5e0806531d54505cb61b091583b6b1574087ff7" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.541979 4817 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"25010709ae6e010f514efd55c5e0806531d54505cb61b091583b6b1574087ff7"} err="failed to get container status \"25010709ae6e010f514efd55c5e0806531d54505cb61b091583b6b1574087ff7\": rpc error: code = NotFound desc = could not find container \"25010709ae6e010f514efd55c5e0806531d54505cb61b091583b6b1574087ff7\": container with ID starting with 25010709ae6e010f514efd55c5e0806531d54505cb61b091583b6b1574087ff7 not found: ID does not exist" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.542061 4817 scope.go:117] "RemoveContainer" containerID="9038c92a5069bb2d6a27e3816489724a8a479f0f8e98d8dd474f132ec2e9f759" Mar 20 12:30:40 crc kubenswrapper[4817]: E0320 12:30:40.542428 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9038c92a5069bb2d6a27e3816489724a8a479f0f8e98d8dd474f132ec2e9f759\": container with ID starting with 9038c92a5069bb2d6a27e3816489724a8a479f0f8e98d8dd474f132ec2e9f759 not found: ID does not exist" containerID="9038c92a5069bb2d6a27e3816489724a8a479f0f8e98d8dd474f132ec2e9f759" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.542484 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9038c92a5069bb2d6a27e3816489724a8a479f0f8e98d8dd474f132ec2e9f759"} err="failed to get container status \"9038c92a5069bb2d6a27e3816489724a8a479f0f8e98d8dd474f132ec2e9f759\": rpc error: code = NotFound desc = could not find container \"9038c92a5069bb2d6a27e3816489724a8a479f0f8e98d8dd474f132ec2e9f759\": container with ID starting with 9038c92a5069bb2d6a27e3816489724a8a479f0f8e98d8dd474f132ec2e9f759 not found: ID does not exist" Mar 20 12:30:40 crc kubenswrapper[4817]: E0320 12:30:40.555574 4817 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-26e96a06130cae32c3437bbe5d39afe391c0a72be855f23bac72e9cfec5bc973\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice\": RecentStats: unable to find data in memory cache]" Mar 20 12:30:40 crc kubenswrapper[4817]: I0320 12:30:40.674481 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 12:30:41 crc kubenswrapper[4817]: E0320 12:30:41.463484 4817 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:41 crc kubenswrapper[4817]: E0320 12:30:41.466019 4817 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:41 crc kubenswrapper[4817]: E0320 12:30:41.466554 4817 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:41 crc kubenswrapper[4817]: E0320 12:30:41.466847 4817 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:41 crc kubenswrapper[4817]: E0320 12:30:41.467221 4817 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:41 crc kubenswrapper[4817]: I0320 12:30:41.467270 4817 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 12:30:41 crc kubenswrapper[4817]: E0320 12:30:41.467719 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="200ms" Mar 20 12:30:41 crc kubenswrapper[4817]: E0320 12:30:41.668772 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="400ms" Mar 20 12:30:42 crc kubenswrapper[4817]: E0320 12:30:42.072804 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="800ms" Mar 20 12:30:42 crc kubenswrapper[4817]: E0320 12:30:42.874649 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="1.6s" Mar 20 12:30:44 crc kubenswrapper[4817]: E0320 12:30:44.476559 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection 
refused" interval="3.2s" Mar 20 12:30:45 crc kubenswrapper[4817]: E0320 12:30:45.750619 4817 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.22:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" volumeName="registry-storage" Mar 20 12:30:47 crc kubenswrapper[4817]: E0320 12:30:47.021972 4817 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e8c8c50e59bfe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 12:30:37.51108915 +0000 UTC m=+199.599401933,LastTimestamp:2026-03-20 12:30:37.51108915 +0000 UTC m=+199.599401933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 12:30:47 crc kubenswrapper[4817]: E0320 12:30:47.678446 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="6.4s" Mar 20 12:30:48 crc kubenswrapper[4817]: I0320 12:30:48.668531 4817 status_manager.go:851] "Failed to get status for pod" podUID="2f22ca62-a967-4dba-a7bd-27d16d4f2214" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5d7d994-pddlv\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:48 crc kubenswrapper[4817]: I0320 12:30:48.669114 4817 status_manager.go:851] "Failed to get status for pod" podUID="e36838e6-8822-4760-a052-b888667f5a14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:50 crc kubenswrapper[4817]: I0320 12:30:50.663354 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:50 crc kubenswrapper[4817]: I0320 12:30:50.664998 4817 status_manager.go:851] "Failed to get status for pod" podUID="e36838e6-8822-4760-a052-b888667f5a14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:50 crc kubenswrapper[4817]: I0320 12:30:50.665582 4817 status_manager.go:851] "Failed to get status for pod" podUID="2f22ca62-a967-4dba-a7bd-27d16d4f2214" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5d7d994-pddlv\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:50 crc kubenswrapper[4817]: I0320 12:30:50.682668 4817 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="acad212e-62b5-4f3e-921e-d682e1234d6c" Mar 20 12:30:50 crc kubenswrapper[4817]: I0320 12:30:50.682818 4817 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="acad212e-62b5-4f3e-921e-d682e1234d6c" Mar 20 12:30:50 crc kubenswrapper[4817]: E0320 12:30:50.683474 4817 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:50 crc kubenswrapper[4817]: I0320 12:30:50.684345 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:50 crc kubenswrapper[4817]: W0320 12:30:50.706908 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-895ce14873e81b3dddda939cf225d783d633a141277b8d1abc76dcdd4d27f3bf WatchSource:0}: Error finding container 895ce14873e81b3dddda939cf225d783d633a141277b8d1abc76dcdd4d27f3bf: Status 404 returned error can't find the container with id 895ce14873e81b3dddda939cf225d783d633a141277b8d1abc76dcdd4d27f3bf Mar 20 12:30:51 crc kubenswrapper[4817]: I0320 12:30:51.477387 4817 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="b585222250d21482fea69233499afaff1285b5945560f3f594dad806c6caa60a" exitCode=0 Mar 20 12:30:51 crc kubenswrapper[4817]: I0320 12:30:51.477542 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"b585222250d21482fea69233499afaff1285b5945560f3f594dad806c6caa60a"} Mar 20 12:30:51 crc kubenswrapper[4817]: I0320 12:30:51.478036 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"895ce14873e81b3dddda939cf225d783d633a141277b8d1abc76dcdd4d27f3bf"} Mar 20 12:30:51 crc kubenswrapper[4817]: I0320 12:30:51.478597 4817 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="acad212e-62b5-4f3e-921e-d682e1234d6c" Mar 20 12:30:51 crc kubenswrapper[4817]: I0320 12:30:51.478636 4817 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="acad212e-62b5-4f3e-921e-d682e1234d6c" Mar 20 12:30:51 crc kubenswrapper[4817]: E0320 12:30:51.479349 4817 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:51 crc kubenswrapper[4817]: I0320 12:30:51.479394 4817 status_manager.go:851] "Failed to get status for pod" podUID="2f22ca62-a967-4dba-a7bd-27d16d4f2214" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5d7d994-pddlv\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:51 crc kubenswrapper[4817]: I0320 12:30:51.480090 4817 status_manager.go:851] "Failed to get status for pod" podUID="e36838e6-8822-4760-a052-b888667f5a14" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Mar 20 12:30:52 crc kubenswrapper[4817]: I0320 12:30:52.488574 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 12:30:52 crc kubenswrapper[4817]: I0320 12:30:52.489233 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 12:30:52 crc kubenswrapper[4817]: I0320 12:30:52.489283 4817 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="7d9b567fdd75a25d9edddbbf2a0d25f719ee7e2adc49dd4d46449adbb3266233" exitCode=1 Mar 20 12:30:52 crc kubenswrapper[4817]: I0320 12:30:52.489369 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"7d9b567fdd75a25d9edddbbf2a0d25f719ee7e2adc49dd4d46449adbb3266233"} Mar 20 12:30:52 crc kubenswrapper[4817]: I0320 12:30:52.490085 4817 scope.go:117] "RemoveContainer" containerID="7d9b567fdd75a25d9edddbbf2a0d25f719ee7e2adc49dd4d46449adbb3266233" Mar 20 12:30:52 crc kubenswrapper[4817]: I0320 12:30:52.498519 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"43e7322e9254fa3be7b5af4fc951cdba8fc2cde6b541bc191463575fb3b3ca17"} Mar 20 12:30:52 crc kubenswrapper[4817]: I0320 12:30:52.498572 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0801043d0d5fd4d7b7d5cc90d23aad1b50f81491cd2b59cd0679379a2e43dcbe"} Mar 20 12:30:52 crc kubenswrapper[4817]: I0320 12:30:52.498588 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"50464f0cf8954b38cf329d7bea622b777375ea642a0b559b8478e7aa25671751"} Mar 20 12:30:53 crc kubenswrapper[4817]: I0320 12:30:53.508931 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 12:30:53 crc kubenswrapper[4817]: I0320 12:30:53.509822 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 12:30:53 crc kubenswrapper[4817]: I0320 12:30:53.509908 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b4baacce6549ba31f9173c5fcef7d6d43ef06e38517fad9add7b343b752a4f5a"} Mar 20 12:30:53 crc kubenswrapper[4817]: I0320 12:30:53.514279 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c424f2b89717866790b917b4949e789a6e706b9f9283eabd9033b453b28d28b5"} Mar 20 12:30:53 crc kubenswrapper[4817]: I0320 12:30:53.514349 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2fece9c62bc832d86f778993a7f707cb16389e25b372f7e6540f579e491240d0"} Mar 20 12:30:53 crc kubenswrapper[4817]: I0320 12:30:53.514495 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:53 crc kubenswrapper[4817]: I0320 12:30:53.514573 4817 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="acad212e-62b5-4f3e-921e-d682e1234d6c" Mar 20 12:30:53 crc kubenswrapper[4817]: I0320 12:30:53.514605 4817 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="acad212e-62b5-4f3e-921e-d682e1234d6c" Mar 20 12:30:55 crc kubenswrapper[4817]: I0320 12:30:55.575611 4817 patch_prober.go:28] interesting pod/machine-config-daemon-dch6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 12:30:55 crc kubenswrapper[4817]: I0320 12:30:55.576022 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dch6v" 
podUID="c8b7e138-8c64-47fb-84b7-4a42e612947d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 12:30:55 crc kubenswrapper[4817]: I0320 12:30:55.685075 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:55 crc kubenswrapper[4817]: I0320 12:30:55.685361 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:55 crc kubenswrapper[4817]: I0320 12:30:55.694399 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:56 crc kubenswrapper[4817]: I0320 12:30:56.735416 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" podUID="a2493ed0-295f-4eba-8870-3f5716a76ca6" containerName="oauth-openshift" containerID="cri-o://911b7bd737f4aaefa14bdaa92652b67d5587feee6412f508ae8da20d25212b76" gracePeriod=15 Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.322043 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.323500 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-cliconfig\") pod \"a2493ed0-295f-4eba-8870-3f5716a76ca6\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.323574 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2493ed0-295f-4eba-8870-3f5716a76ca6-audit-dir\") pod \"a2493ed0-295f-4eba-8870-3f5716a76ca6\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.323716 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2493ed0-295f-4eba-8870-3f5716a76ca6-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "a2493ed0-295f-4eba-8870-3f5716a76ca6" (UID: "a2493ed0-295f-4eba-8870-3f5716a76ca6"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.323921 4817 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2493ed0-295f-4eba-8870-3f5716a76ca6-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.324543 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "a2493ed0-295f-4eba-8870-3f5716a76ca6" (UID: "a2493ed0-295f-4eba-8870-3f5716a76ca6"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.424535 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-session\") pod \"a2493ed0-295f-4eba-8870-3f5716a76ca6\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.424617 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-idp-0-file-data\") pod \"a2493ed0-295f-4eba-8870-3f5716a76ca6\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.424655 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-error\") pod \"a2493ed0-295f-4eba-8870-3f5716a76ca6\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.424693 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-provider-selection\") pod \"a2493ed0-295f-4eba-8870-3f5716a76ca6\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.424735 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-ocp-branding-template\") pod \"a2493ed0-295f-4eba-8870-3f5716a76ca6\" (UID: 
\"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.424781 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-router-certs\") pod \"a2493ed0-295f-4eba-8870-3f5716a76ca6\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.424819 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-audit-policies\") pod \"a2493ed0-295f-4eba-8870-3f5716a76ca6\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.424861 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl2wv\" (UniqueName: \"kubernetes.io/projected/a2493ed0-295f-4eba-8870-3f5716a76ca6-kube-api-access-fl2wv\") pod \"a2493ed0-295f-4eba-8870-3f5716a76ca6\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.424970 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-login\") pod \"a2493ed0-295f-4eba-8870-3f5716a76ca6\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.425018 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-service-ca\") pod \"a2493ed0-295f-4eba-8870-3f5716a76ca6\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.425064 4817 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-trusted-ca-bundle\") pod \"a2493ed0-295f-4eba-8870-3f5716a76ca6\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.425342 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-serving-cert\") pod \"a2493ed0-295f-4eba-8870-3f5716a76ca6\" (UID: \"a2493ed0-295f-4eba-8870-3f5716a76ca6\") " Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.425883 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.426879 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "a2493ed0-295f-4eba-8870-3f5716a76ca6" (UID: "a2493ed0-295f-4eba-8870-3f5716a76ca6"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.427297 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "a2493ed0-295f-4eba-8870-3f5716a76ca6" (UID: "a2493ed0-295f-4eba-8870-3f5716a76ca6"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.427390 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "a2493ed0-295f-4eba-8870-3f5716a76ca6" (UID: "a2493ed0-295f-4eba-8870-3f5716a76ca6"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.433789 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "a2493ed0-295f-4eba-8870-3f5716a76ca6" (UID: "a2493ed0-295f-4eba-8870-3f5716a76ca6"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.435354 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "a2493ed0-295f-4eba-8870-3f5716a76ca6" (UID: "a2493ed0-295f-4eba-8870-3f5716a76ca6"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.435520 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2493ed0-295f-4eba-8870-3f5716a76ca6-kube-api-access-fl2wv" (OuterVolumeSpecName: "kube-api-access-fl2wv") pod "a2493ed0-295f-4eba-8870-3f5716a76ca6" (UID: "a2493ed0-295f-4eba-8870-3f5716a76ca6"). InnerVolumeSpecName "kube-api-access-fl2wv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.436405 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "a2493ed0-295f-4eba-8870-3f5716a76ca6" (UID: "a2493ed0-295f-4eba-8870-3f5716a76ca6"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.436820 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "a2493ed0-295f-4eba-8870-3f5716a76ca6" (UID: "a2493ed0-295f-4eba-8870-3f5716a76ca6"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.437026 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "a2493ed0-295f-4eba-8870-3f5716a76ca6" (UID: "a2493ed0-295f-4eba-8870-3f5716a76ca6"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.438039 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "a2493ed0-295f-4eba-8870-3f5716a76ca6" (UID: "a2493ed0-295f-4eba-8870-3f5716a76ca6"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.438471 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "a2493ed0-295f-4eba-8870-3f5716a76ca6" (UID: "a2493ed0-295f-4eba-8870-3f5716a76ca6"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.438925 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "a2493ed0-295f-4eba-8870-3f5716a76ca6" (UID: "a2493ed0-295f-4eba-8870-3f5716a76ca6"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.526657 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.527042 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.527055 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.527066 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.527078 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.527087 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.527149 4817 reconciler_common.go:293] "Volume detached for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.527160 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.527169 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl2wv\" (UniqueName: \"kubernetes.io/projected/a2493ed0-295f-4eba-8870-3f5716a76ca6-kube-api-access-fl2wv\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.527178 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.527187 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.527197 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2493ed0-295f-4eba-8870-3f5716a76ca6-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.550990 4817 generic.go:334] "Generic (PLEG): container finished" podID="a2493ed0-295f-4eba-8870-3f5716a76ca6" containerID="911b7bd737f4aaefa14bdaa92652b67d5587feee6412f508ae8da20d25212b76" exitCode=0 Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.551066 4817 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" event={"ID":"a2493ed0-295f-4eba-8870-3f5716a76ca6","Type":"ContainerDied","Data":"911b7bd737f4aaefa14bdaa92652b67d5587feee6412f508ae8da20d25212b76"} Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.551156 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" event={"ID":"a2493ed0-295f-4eba-8870-3f5716a76ca6","Type":"ContainerDied","Data":"33c897ec87113e97c15fdf27b6d5aa080866f955b2b97e25a34ade2963df6642"} Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.551191 4817 scope.go:117] "RemoveContainer" containerID="911b7bd737f4aaefa14bdaa92652b67d5587feee6412f508ae8da20d25212b76" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.551199 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v6wkw" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.585112 4817 scope.go:117] "RemoveContainer" containerID="911b7bd737f4aaefa14bdaa92652b67d5587feee6412f508ae8da20d25212b76" Mar 20 12:30:57 crc kubenswrapper[4817]: E0320 12:30:57.585840 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"911b7bd737f4aaefa14bdaa92652b67d5587feee6412f508ae8da20d25212b76\": container with ID starting with 911b7bd737f4aaefa14bdaa92652b67d5587feee6412f508ae8da20d25212b76 not found: ID does not exist" containerID="911b7bd737f4aaefa14bdaa92652b67d5587feee6412f508ae8da20d25212b76" Mar 20 12:30:57 crc kubenswrapper[4817]: I0320 12:30:57.585917 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"911b7bd737f4aaefa14bdaa92652b67d5587feee6412f508ae8da20d25212b76"} err="failed to get container status \"911b7bd737f4aaefa14bdaa92652b67d5587feee6412f508ae8da20d25212b76\": rpc error: code = NotFound desc = could not find container 
\"911b7bd737f4aaefa14bdaa92652b67d5587feee6412f508ae8da20d25212b76\": container with ID starting with 911b7bd737f4aaefa14bdaa92652b67d5587feee6412f508ae8da20d25212b76 not found: ID does not exist" Mar 20 12:30:58 crc kubenswrapper[4817]: I0320 12:30:58.528234 4817 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:58 crc kubenswrapper[4817]: I0320 12:30:58.560738 4817 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="acad212e-62b5-4f3e-921e-d682e1234d6c" Mar 20 12:30:58 crc kubenswrapper[4817]: I0320 12:30:58.560767 4817 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="acad212e-62b5-4f3e-921e-d682e1234d6c" Mar 20 12:30:58 crc kubenswrapper[4817]: I0320 12:30:58.566804 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:30:58 crc kubenswrapper[4817]: I0320 12:30:58.676460 4817 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f6a8a96a-e2c4-4030-9f30-7f3abdeee54d" Mar 20 12:30:58 crc kubenswrapper[4817]: E0320 12:30:58.975263 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"audit\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Mar 20 12:30:58 crc kubenswrapper[4817]: I0320 12:30:58.983211 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 12:30:59 crc kubenswrapper[4817]: E0320 12:30:59.185960 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"oauth-openshift-dockercfg-znhcc\": Failed to watch *v1.Secret: unknown (get secrets)" logger="UnhandledError" Mar 20 12:30:59 crc 
kubenswrapper[4817]: I0320 12:30:59.568239 4817 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="acad212e-62b5-4f3e-921e-d682e1234d6c" Mar 20 12:30:59 crc kubenswrapper[4817]: I0320 12:30:59.568295 4817 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="acad212e-62b5-4f3e-921e-d682e1234d6c" Mar 20 12:30:59 crc kubenswrapper[4817]: I0320 12:30:59.573874 4817 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f6a8a96a-e2c4-4030-9f30-7f3abdeee54d" Mar 20 12:31:01 crc kubenswrapper[4817]: I0320 12:31:01.226453 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 12:31:01 crc kubenswrapper[4817]: I0320 12:31:01.233477 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 12:31:08 crc kubenswrapper[4817]: I0320 12:31:08.386818 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 12:31:08 crc kubenswrapper[4817]: I0320 12:31:08.532286 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 12:31:08 crc kubenswrapper[4817]: I0320 12:31:08.552188 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 12:31:08 crc kubenswrapper[4817]: I0320 12:31:08.579182 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 12:31:08 crc kubenswrapper[4817]: I0320 12:31:08.661712 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 12:31:08 crc kubenswrapper[4817]: I0320 12:31:08.699721 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 12:31:08 crc kubenswrapper[4817]: I0320 12:31:08.913055 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 12:31:08 crc kubenswrapper[4817]: I0320 12:31:08.987990 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 12:31:09 crc kubenswrapper[4817]: I0320 12:31:09.031732 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 12:31:09 crc kubenswrapper[4817]: I0320 12:31:09.278230 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 12:31:09 crc kubenswrapper[4817]: I0320 12:31:09.586324 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 12:31:09 crc kubenswrapper[4817]: I0320 12:31:09.697609 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 12:31:09 crc kubenswrapper[4817]: I0320 12:31:09.712409 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 12:31:09 crc kubenswrapper[4817]: I0320 12:31:09.719643 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 12:31:09 crc kubenswrapper[4817]: I0320 12:31:09.881322 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 12:31:09 crc kubenswrapper[4817]: I0320 12:31:09.964407 4817 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 12:31:10 crc kubenswrapper[4817]: I0320 12:31:10.032319 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 12:31:10 crc kubenswrapper[4817]: I0320 12:31:10.079846 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 12:31:10 crc kubenswrapper[4817]: I0320 12:31:10.426819 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 12:31:10 crc kubenswrapper[4817]: I0320 12:31:10.543451 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 12:31:10 crc kubenswrapper[4817]: I0320 12:31:10.677311 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 12:31:10 crc kubenswrapper[4817]: I0320 12:31:10.693787 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 12:31:10 crc kubenswrapper[4817]: I0320 12:31:10.694055 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 12:31:10 crc kubenswrapper[4817]: I0320 12:31:10.750260 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 12:31:11 crc kubenswrapper[4817]: I0320 12:31:11.094337 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 12:31:11 crc kubenswrapper[4817]: I0320 12:31:11.119953 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 12:31:11 crc kubenswrapper[4817]: I0320 12:31:11.152766 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 12:31:11 crc kubenswrapper[4817]: I0320 12:31:11.223422 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 12:31:11 crc kubenswrapper[4817]: I0320 12:31:11.277145 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 12:31:11 crc kubenswrapper[4817]: I0320 12:31:11.319495 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 12:31:11 crc kubenswrapper[4817]: I0320 12:31:11.333558 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 12:31:11 crc kubenswrapper[4817]: I0320 12:31:11.378989 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 12:31:11 crc kubenswrapper[4817]: I0320 12:31:11.395041 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 12:31:11 crc kubenswrapper[4817]: I0320 12:31:11.542265 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 12:31:11 crc kubenswrapper[4817]: I0320 12:31:11.605071 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 12:31:11 crc kubenswrapper[4817]: I0320 12:31:11.676272 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 12:31:11 crc kubenswrapper[4817]: I0320 
12:31:11.925230 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.052958 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.078712 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.192774 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.210906 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.251838 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.313542 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.340664 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.359471 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.478609 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.557047 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 
12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.578384 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.590217 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.605412 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.619687 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.629919 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.738248 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.779265 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.792525 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.858525 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 12:31:12 crc kubenswrapper[4817]: I0320 12:31:12.883223 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.013224 4817 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 
20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.164960 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.339963 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.386952 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.445985 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.450075 4817 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.454284 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" podStartSLOduration=41.454266098 podStartE2EDuration="41.454266098s" podCreationTimestamp="2026-03-20 12:30:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:30:58.437071216 +0000 UTC m=+220.525383999" watchObservedRunningTime="2026-03-20 12:31:13.454266098 +0000 UTC m=+235.542578881" Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.456743 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-v6wkw"] Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.456038 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.456797 4817 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.466943 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.484183 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.484770 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=15.484748522 podStartE2EDuration="15.484748522s" podCreationTimestamp="2026-03-20 12:30:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:31:13.48369372 +0000 UTC m=+235.572006523" watchObservedRunningTime="2026-03-20 12:31:13.484748522 +0000 UTC m=+235.573061305" Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.592719 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.592840 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.733937 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.857288 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.914003 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 
12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.915148 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 12:31:13 crc kubenswrapper[4817]: I0320 12:31:13.984775 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 12:31:14 crc kubenswrapper[4817]: I0320 12:31:14.164853 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 12:31:14 crc kubenswrapper[4817]: I0320 12:31:14.308384 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 12:31:14 crc kubenswrapper[4817]: I0320 12:31:14.334542 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 12:31:14 crc kubenswrapper[4817]: I0320 12:31:14.340850 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 12:31:14 crc kubenswrapper[4817]: I0320 12:31:14.342838 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 12:31:14 crc kubenswrapper[4817]: I0320 12:31:14.397794 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 12:31:14 crc kubenswrapper[4817]: I0320 12:31:14.409255 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 12:31:14 crc kubenswrapper[4817]: I0320 12:31:14.570433 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 12:31:14 crc kubenswrapper[4817]: I0320 12:31:14.604759 4817 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 12:31:14 crc kubenswrapper[4817]: I0320 12:31:14.636478 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 12:31:14 crc kubenswrapper[4817]: I0320 12:31:14.645096 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 12:31:14 crc kubenswrapper[4817]: I0320 12:31:14.663846 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 12:31:14 crc kubenswrapper[4817]: I0320 12:31:14.671021 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2493ed0-295f-4eba-8870-3f5716a76ca6" path="/var/lib/kubelet/pods/a2493ed0-295f-4eba-8870-3f5716a76ca6/volumes" Mar 20 12:31:14 crc kubenswrapper[4817]: I0320 12:31:14.744058 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 12:31:14 crc kubenswrapper[4817]: I0320 12:31:14.919826 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 12:31:14 crc kubenswrapper[4817]: I0320 12:31:14.931803 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 12:31:14 crc kubenswrapper[4817]: I0320 12:31:14.950606 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 12:31:15 crc kubenswrapper[4817]: I0320 12:31:15.029344 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 12:31:15 crc kubenswrapper[4817]: I0320 12:31:15.183394 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 
12:31:15 crc kubenswrapper[4817]: I0320 12:31:15.192665 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 12:31:15 crc kubenswrapper[4817]: I0320 12:31:15.197439 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 12:31:15 crc kubenswrapper[4817]: I0320 12:31:15.207690 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 12:31:15 crc kubenswrapper[4817]: I0320 12:31:15.317234 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 12:31:15 crc kubenswrapper[4817]: I0320 12:31:15.379913 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 12:31:15 crc kubenswrapper[4817]: I0320 12:31:15.451779 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 12:31:15 crc kubenswrapper[4817]: I0320 12:31:15.454113 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 12:31:15 crc kubenswrapper[4817]: I0320 12:31:15.610632 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 12:31:15 crc kubenswrapper[4817]: I0320 12:31:15.627179 4817 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 12:31:15 crc kubenswrapper[4817]: I0320 12:31:15.639662 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 12:31:15 crc kubenswrapper[4817]: I0320 12:31:15.673090 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 12:31:15 crc kubenswrapper[4817]: I0320 12:31:15.680710 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 12:31:15 crc kubenswrapper[4817]: I0320 12:31:15.761737 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 12:31:15 crc kubenswrapper[4817]: I0320 12:31:15.947294 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 12:31:16 crc kubenswrapper[4817]: I0320 12:31:16.028270 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 12:31:16 crc kubenswrapper[4817]: I0320 12:31:16.065466 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 12:31:16 crc kubenswrapper[4817]: I0320 12:31:16.066567 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 12:31:16 crc kubenswrapper[4817]: I0320 12:31:16.093476 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 12:31:16 crc kubenswrapper[4817]: I0320 12:31:16.127749 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 12:31:16 crc kubenswrapper[4817]: I0320 12:31:16.167644 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 12:31:16 crc kubenswrapper[4817]: I0320 12:31:16.282250 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 12:31:16 crc kubenswrapper[4817]: I0320 
12:31:16.358809 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 12:31:16 crc kubenswrapper[4817]: I0320 12:31:16.435777 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 12:31:16 crc kubenswrapper[4817]: I0320 12:31:16.541871 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 12:31:16 crc kubenswrapper[4817]: I0320 12:31:16.597323 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 12:31:16 crc kubenswrapper[4817]: I0320 12:31:16.624955 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 12:31:16 crc kubenswrapper[4817]: I0320 12:31:16.650482 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 12:31:16 crc kubenswrapper[4817]: I0320 12:31:16.680306 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 12:31:16 crc kubenswrapper[4817]: I0320 12:31:16.935682 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 12:31:17 crc kubenswrapper[4817]: I0320 12:31:17.261791 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 12:31:17 crc kubenswrapper[4817]: I0320 12:31:17.279410 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 12:31:17 crc kubenswrapper[4817]: I0320 12:31:17.307325 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 12:31:17 crc kubenswrapper[4817]: I0320 
12:31:17.319235 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 12:31:17 crc kubenswrapper[4817]: I0320 12:31:17.389413 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 12:31:17 crc kubenswrapper[4817]: I0320 12:31:17.394747 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 12:31:17 crc kubenswrapper[4817]: I0320 12:31:17.396071 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 12:31:17 crc kubenswrapper[4817]: I0320 12:31:17.403964 4817 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 12:31:17 crc kubenswrapper[4817]: I0320 12:31:17.411914 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 12:31:17 crc kubenswrapper[4817]: I0320 12:31:17.436823 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 12:31:17 crc kubenswrapper[4817]: I0320 12:31:17.497804 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 12:31:17 crc kubenswrapper[4817]: I0320 12:31:17.575293 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 12:31:17 crc kubenswrapper[4817]: I0320 12:31:17.705685 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 12:31:17 crc kubenswrapper[4817]: I0320 12:31:17.807584 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"serving-cert" Mar 20 12:31:17 crc kubenswrapper[4817]: I0320 12:31:17.882522 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 12:31:17 crc kubenswrapper[4817]: I0320 12:31:17.883684 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.033642 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.062614 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.082847 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.116866 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.164959 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-76749dd666-shlg4"] Mar 20 12:31:18 crc kubenswrapper[4817]: E0320 12:31:18.165309 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2493ed0-295f-4eba-8870-3f5716a76ca6" containerName="oauth-openshift" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.165338 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2493ed0-295f-4eba-8870-3f5716a76ca6" containerName="oauth-openshift" Mar 20 12:31:18 crc kubenswrapper[4817]: E0320 12:31:18.165360 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36838e6-8822-4760-a052-b888667f5a14" containerName="installer" Mar 20 12:31:18 crc 
kubenswrapper[4817]: I0320 12:31:18.165375 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36838e6-8822-4760-a052-b888667f5a14" containerName="installer" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.165576 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="e36838e6-8822-4760-a052-b888667f5a14" containerName="installer" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.165613 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2493ed0-295f-4eba-8870-3f5716a76ca6" containerName="oauth-openshift" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.166309 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.168253 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.168894 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.169264 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.170066 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.170275 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.170424 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.170832 4817 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.171057 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.171353 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.171497 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.179553 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.179566 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.185430 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.188866 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.194431 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.247413 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.288273 4817 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.299604 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.299967 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.315227 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.315274 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-user-template-error\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.315314 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dda1643f-fddc-4670-90ac-bcdc2d212931-audit-dir\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.315368 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-router-certs\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.315392 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-session\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.315418 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-user-template-login\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.315439 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.315481 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d872v\" (UniqueName: \"kubernetes.io/projected/dda1643f-fddc-4670-90ac-bcdc2d212931-kube-api-access-d872v\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: 
\"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.315506 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dda1643f-fddc-4670-90ac-bcdc2d212931-audit-policies\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.315533 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.315560 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-service-ca\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.315585 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc 
kubenswrapper[4817]: I0320 12:31:18.315617 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.315652 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.328768 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.416494 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.416545 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-service-ca\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 
20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.416565 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.416591 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.416616 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.416637 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.416652 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-user-template-error\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.416677 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dda1643f-fddc-4670-90ac-bcdc2d212931-audit-dir\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.416707 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-router-certs\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.416723 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-session\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.416743 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-user-template-login\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 
crc kubenswrapper[4817]: I0320 12:31:18.416759 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.416787 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d872v\" (UniqueName: \"kubernetes.io/projected/dda1643f-fddc-4670-90ac-bcdc2d212931-kube-api-access-d872v\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.416802 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dda1643f-fddc-4670-90ac-bcdc2d212931-audit-policies\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.417553 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dda1643f-fddc-4670-90ac-bcdc2d212931-audit-policies\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.417845 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dda1643f-fddc-4670-90ac-bcdc2d212931-audit-dir\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " 
pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.418769 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.419260 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-service-ca\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.419440 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.424754 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-session\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.426260 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 
12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.426447 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-user-template-error\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.426500 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-router-certs\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.426842 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.428545 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.430381 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.436923 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d872v\" (UniqueName: \"kubernetes.io/projected/dda1643f-fddc-4670-90ac-bcdc2d212931-kube-api-access-d872v\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.438771 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.442414 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dda1643f-fddc-4670-90ac-bcdc2d212931-v4-0-config-user-template-login\") pod \"oauth-openshift-76749dd666-shlg4\" (UID: \"dda1643f-fddc-4670-90ac-bcdc2d212931\") " pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.494057 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.516465 4817 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.523618 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.544867 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.591876 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.597156 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.601894 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.788087 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.906742 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.920601 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 12:31:18 crc kubenswrapper[4817]: I0320 12:31:18.931267 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 12:31:19 crc kubenswrapper[4817]: I0320 12:31:18.964897 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 12:31:19 crc kubenswrapper[4817]: I0320 12:31:18.981249 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 12:31:19 crc kubenswrapper[4817]: I0320 12:31:19.040512 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 12:31:19 crc kubenswrapper[4817]: I0320 12:31:19.067043 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 12:31:19 crc kubenswrapper[4817]: I0320 12:31:19.075342 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 12:31:19 crc kubenswrapper[4817]: I0320 12:31:19.114274 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 12:31:19 crc kubenswrapper[4817]: I0320 12:31:19.134702 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 12:31:19 crc kubenswrapper[4817]: I0320 12:31:19.308111 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 12:31:19 crc kubenswrapper[4817]: I0320 12:31:19.469437 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 12:31:19 crc kubenswrapper[4817]: I0320 12:31:19.520488 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 12:31:19 crc kubenswrapper[4817]: I0320 12:31:19.520609 4817 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 12:31:19 crc kubenswrapper[4817]: I0320 12:31:19.601292 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 12:31:19 crc kubenswrapper[4817]: I0320 12:31:19.699356 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 12:31:19 crc kubenswrapper[4817]: I0320 12:31:19.797455 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 12:31:19 crc kubenswrapper[4817]: I0320 12:31:19.865775 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 12:31:19 crc kubenswrapper[4817]: I0320 12:31:19.967333 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 12:31:20 crc kubenswrapper[4817]: I0320 12:31:20.018744 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 12:31:20 crc kubenswrapper[4817]: I0320 12:31:20.050155 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 12:31:20 crc kubenswrapper[4817]: I0320 12:31:20.051063 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 12:31:20 crc kubenswrapper[4817]: I0320 12:31:20.078434 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 12:31:20 crc kubenswrapper[4817]: I0320 12:31:20.109235 4817 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"kube-root-ca.crt" Mar 20 12:31:20 crc kubenswrapper[4817]: I0320 12:31:20.140130 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 12:31:20 crc kubenswrapper[4817]: I0320 12:31:20.557155 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 12:31:20 crc kubenswrapper[4817]: I0320 12:31:20.557634 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 12:31:20 crc kubenswrapper[4817]: I0320 12:31:20.558244 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 12:31:20 crc kubenswrapper[4817]: I0320 12:31:20.558800 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 12:31:20 crc kubenswrapper[4817]: I0320 12:31:20.565094 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 12:31:20 crc kubenswrapper[4817]: I0320 12:31:20.570868 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 12:31:20 crc kubenswrapper[4817]: I0320 12:31:20.631053 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 12:31:20 crc kubenswrapper[4817]: I0320 12:31:20.688488 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 12:31:20 crc kubenswrapper[4817]: I0320 12:31:20.705821 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 12:31:20 crc kubenswrapper[4817]: I0320 12:31:20.771632 4817 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 12:31:20 crc kubenswrapper[4817]: I0320 12:31:20.822816 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.032944 4817 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.033219 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://0bb2f4956b868db9d5665765dee47d0593ea68dc4cdc402e30a9a84a68409a6d" gracePeriod=5 Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.033970 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.056180 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.079897 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.170447 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.185158 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.234826 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 12:31:21 crc 
kubenswrapper[4817]: I0320 12:31:21.316158 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.355024 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-76749dd666-shlg4"] Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.383658 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.407628 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.451374 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.468016 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.521598 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.552375 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.562633 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.612353 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.612406 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.711801 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.730717 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-76749dd666-shlg4"] Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.743057 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.773182 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 12:31:21 crc kubenswrapper[4817]: I0320 12:31:21.785544 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 12:31:22 crc kubenswrapper[4817]: I0320 12:31:22.069165 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 12:31:22 crc kubenswrapper[4817]: I0320 12:31:22.122258 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 12:31:22 crc kubenswrapper[4817]: I0320 12:31:22.246789 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 12:31:22 crc kubenswrapper[4817]: I0320 12:31:22.278725 4817 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 12:31:22 crc kubenswrapper[4817]: I0320 12:31:22.413717 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 12:31:22 crc kubenswrapper[4817]: I0320 12:31:22.451251 4817 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 12:31:22 crc kubenswrapper[4817]: I0320 12:31:22.525663 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 12:31:22 crc kubenswrapper[4817]: I0320 12:31:22.730060 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" event={"ID":"dda1643f-fddc-4670-90ac-bcdc2d212931","Type":"ContainerStarted","Data":"081355eeb06f3f48ba4c4b25332b5b7b46225cdb7ac29f53a64f5e8c8767af02"} Mar 20 12:31:22 crc kubenswrapper[4817]: I0320 12:31:22.730205 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" event={"ID":"dda1643f-fddc-4670-90ac-bcdc2d212931","Type":"ContainerStarted","Data":"b459b5a696a658ebbedde22c6af3e0157e90ce5a9d3ed5f6f28b18be3e7c10c4"} Mar 20 12:31:22 crc kubenswrapper[4817]: I0320 12:31:22.734870 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:22 crc kubenswrapper[4817]: I0320 12:31:22.738941 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" Mar 20 12:31:22 crc kubenswrapper[4817]: I0320 12:31:22.747940 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 12:31:22 crc kubenswrapper[4817]: I0320 12:31:22.786189 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-76749dd666-shlg4" podStartSLOduration=51.786149707999996 podStartE2EDuration="51.786149708s" podCreationTimestamp="2026-03-20 12:30:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 12:31:22.769504849 +0000 UTC m=+244.857817672" watchObservedRunningTime="2026-03-20 12:31:22.786149708 +0000 UTC m=+244.874462531" Mar 20 12:31:22 crc kubenswrapper[4817]: I0320 12:31:22.844250 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 12:31:22 crc kubenswrapper[4817]: I0320 12:31:22.968944 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 12:31:23 crc kubenswrapper[4817]: I0320 12:31:23.027669 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 12:31:23 crc kubenswrapper[4817]: I0320 12:31:23.059937 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 12:31:23 crc kubenswrapper[4817]: I0320 12:31:23.164972 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 12:31:23 crc kubenswrapper[4817]: I0320 12:31:23.194842 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 12:31:23 crc kubenswrapper[4817]: I0320 12:31:23.295186 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 12:31:23 crc kubenswrapper[4817]: I0320 12:31:23.591315 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 12:31:23 crc kubenswrapper[4817]: I0320 12:31:23.605977 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 12:31:23 crc kubenswrapper[4817]: I0320 12:31:23.630278 4817 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 12:31:23 crc kubenswrapper[4817]: I0320 12:31:23.989197 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 12:31:24 crc kubenswrapper[4817]: I0320 12:31:24.048806 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 12:31:24 crc kubenswrapper[4817]: I0320 12:31:24.052297 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 12:31:24 crc kubenswrapper[4817]: I0320 12:31:24.091367 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 12:31:24 crc kubenswrapper[4817]: I0320 12:31:24.186434 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 12:31:24 crc kubenswrapper[4817]: I0320 12:31:24.505497 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 12:31:25 crc kubenswrapper[4817]: I0320 12:31:25.008893 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 12:31:25 crc kubenswrapper[4817]: I0320 12:31:25.088931 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 12:31:25 crc kubenswrapper[4817]: I0320 12:31:25.147514 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 12:31:25 crc kubenswrapper[4817]: I0320 12:31:25.308276 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 12:31:25 crc 
kubenswrapper[4817]: I0320 12:31:25.470719 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 12:31:25 crc kubenswrapper[4817]: I0320 12:31:25.574977 4817 patch_prober.go:28] interesting pod/machine-config-daemon-dch6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 12:31:25 crc kubenswrapper[4817]: I0320 12:31:25.575403 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dch6v" podUID="c8b7e138-8c64-47fb-84b7-4a42e612947d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.675863 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.675935 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.775741 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.776294 4817 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="0bb2f4956b868db9d5665765dee47d0593ea68dc4cdc402e30a9a84a68409a6d" exitCode=137 Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.776351 4817 scope.go:117] "RemoveContainer" containerID="0bb2f4956b868db9d5665765dee47d0593ea68dc4cdc402e30a9a84a68409a6d" Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.776433 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.806794 4817 scope.go:117] "RemoveContainer" containerID="0bb2f4956b868db9d5665765dee47d0593ea68dc4cdc402e30a9a84a68409a6d" Mar 20 12:31:26 crc kubenswrapper[4817]: E0320 12:31:26.808181 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bb2f4956b868db9d5665765dee47d0593ea68dc4cdc402e30a9a84a68409a6d\": container with ID starting with 0bb2f4956b868db9d5665765dee47d0593ea68dc4cdc402e30a9a84a68409a6d not found: ID does not exist" containerID="0bb2f4956b868db9d5665765dee47d0593ea68dc4cdc402e30a9a84a68409a6d" Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.808260 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bb2f4956b868db9d5665765dee47d0593ea68dc4cdc402e30a9a84a68409a6d"} err="failed to get container status \"0bb2f4956b868db9d5665765dee47d0593ea68dc4cdc402e30a9a84a68409a6d\": rpc error: code = NotFound desc = could 
not find container \"0bb2f4956b868db9d5665765dee47d0593ea68dc4cdc402e30a9a84a68409a6d\": container with ID starting with 0bb2f4956b868db9d5665765dee47d0593ea68dc4cdc402e30a9a84a68409a6d not found: ID does not exist" Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.838913 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.839036 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.839069 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.839200 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.839230 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.840092 4817 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.840191 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.840186 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.843243 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.858548 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.941720 4817 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.941774 4817 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.941793 4817 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.941811 4817 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:26 crc kubenswrapper[4817]: I0320 12:31:26.941828 4817 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:28 crc kubenswrapper[4817]: I0320 12:31:28.675558 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.309147 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d7d994-pddlv"] Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.309471 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" 
podUID="2f22ca62-a967-4dba-a7bd-27d16d4f2214" containerName="controller-manager" containerID="cri-o://0aeb7c3d051167825d697505e9fc81e500b4ed681d17de77b5f1a8c323716db5" gracePeriod=30 Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.400063 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7"] Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.400272 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7" podUID="8f6a3ae8-1cee-4790-8fa4-37a7a613ed97" containerName="route-controller-manager" containerID="cri-o://d21617e509c20a59e0330791ce1d106edfa21ed325153fdc8fbf6fea1501b338" gracePeriod=30 Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.760871 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.835548 4817 generic.go:334] "Generic (PLEG): container finished" podID="8f6a3ae8-1cee-4790-8fa4-37a7a613ed97" containerID="d21617e509c20a59e0330791ce1d106edfa21ed325153fdc8fbf6fea1501b338" exitCode=0 Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.835780 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7" event={"ID":"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97","Type":"ContainerDied","Data":"d21617e509c20a59e0330791ce1d106edfa21ed325153fdc8fbf6fea1501b338"} Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.837467 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-proxy-ca-bundles\") pod \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " Mar 20 12:31:32 crc 
kubenswrapper[4817]: I0320 12:31:32.837509 4817 generic.go:334] "Generic (PLEG): container finished" podID="2f22ca62-a967-4dba-a7bd-27d16d4f2214" containerID="0aeb7c3d051167825d697505e9fc81e500b4ed681d17de77b5f1a8c323716db5" exitCode=0 Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.837538 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" event={"ID":"2f22ca62-a967-4dba-a7bd-27d16d4f2214","Type":"ContainerDied","Data":"0aeb7c3d051167825d697505e9fc81e500b4ed681d17de77b5f1a8c323716db5"} Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.838546 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" event={"ID":"2f22ca62-a967-4dba-a7bd-27d16d4f2214","Type":"ContainerDied","Data":"ab7832b4bf62e88ffcead90506627d73fb16fae9abbe724de41b650b6385f3dc"} Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.838401 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2f22ca62-a967-4dba-a7bd-27d16d4f2214" (UID: "2f22ca62-a967-4dba-a7bd-27d16d4f2214"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.837589 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d7d994-pddlv" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.838617 4817 scope.go:117] "RemoveContainer" containerID="0aeb7c3d051167825d697505e9fc81e500b4ed681d17de77b5f1a8c323716db5" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.839243 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfhqq\" (UniqueName: \"kubernetes.io/projected/2f22ca62-a967-4dba-a7bd-27d16d4f2214-kube-api-access-hfhqq\") pod \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.839407 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f22ca62-a967-4dba-a7bd-27d16d4f2214-serving-cert\") pod \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.839480 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-config\") pod \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.839525 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-client-ca\") pod \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\" (UID: \"2f22ca62-a967-4dba-a7bd-27d16d4f2214\") " Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.839997 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 
12:31:32.840721 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-client-ca" (OuterVolumeSpecName: "client-ca") pod "2f22ca62-a967-4dba-a7bd-27d16d4f2214" (UID: "2f22ca62-a967-4dba-a7bd-27d16d4f2214"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.840865 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-config" (OuterVolumeSpecName: "config") pod "2f22ca62-a967-4dba-a7bd-27d16d4f2214" (UID: "2f22ca62-a967-4dba-a7bd-27d16d4f2214"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.845783 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f22ca62-a967-4dba-a7bd-27d16d4f2214-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2f22ca62-a967-4dba-a7bd-27d16d4f2214" (UID: "2f22ca62-a967-4dba-a7bd-27d16d4f2214"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.847076 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f22ca62-a967-4dba-a7bd-27d16d4f2214-kube-api-access-hfhqq" (OuterVolumeSpecName: "kube-api-access-hfhqq") pod "2f22ca62-a967-4dba-a7bd-27d16d4f2214" (UID: "2f22ca62-a967-4dba-a7bd-27d16d4f2214"). InnerVolumeSpecName "kube-api-access-hfhqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.873754 4817 scope.go:117] "RemoveContainer" containerID="0aeb7c3d051167825d697505e9fc81e500b4ed681d17de77b5f1a8c323716db5" Mar 20 12:31:32 crc kubenswrapper[4817]: E0320 12:31:32.874304 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aeb7c3d051167825d697505e9fc81e500b4ed681d17de77b5f1a8c323716db5\": container with ID starting with 0aeb7c3d051167825d697505e9fc81e500b4ed681d17de77b5f1a8c323716db5 not found: ID does not exist" containerID="0aeb7c3d051167825d697505e9fc81e500b4ed681d17de77b5f1a8c323716db5" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.874347 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aeb7c3d051167825d697505e9fc81e500b4ed681d17de77b5f1a8c323716db5"} err="failed to get container status \"0aeb7c3d051167825d697505e9fc81e500b4ed681d17de77b5f1a8c323716db5\": rpc error: code = NotFound desc = could not find container \"0aeb7c3d051167825d697505e9fc81e500b4ed681d17de77b5f1a8c323716db5\": container with ID starting with 0aeb7c3d051167825d697505e9fc81e500b4ed681d17de77b5f1a8c323716db5 not found: ID does not exist" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.875926 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.941699 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-config\") pod \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\" (UID: \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\") " Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.941793 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-serving-cert\") pod \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\" (UID: \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\") " Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.942051 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvxt2\" (UniqueName: \"kubernetes.io/projected/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-kube-api-access-dvxt2\") pod \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\" (UID: \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\") " Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.942209 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-client-ca\") pod \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\" (UID: \"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97\") " Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.942776 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.942829 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f22ca62-a967-4dba-a7bd-27d16d4f2214-client-ca\") on node \"crc\" 
DevicePath \"\"" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.942860 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfhqq\" (UniqueName: \"kubernetes.io/projected/2f22ca62-a967-4dba-a7bd-27d16d4f2214-kube-api-access-hfhqq\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.942888 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f22ca62-a967-4dba-a7bd-27d16d4f2214-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.942950 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-client-ca" (OuterVolumeSpecName: "client-ca") pod "8f6a3ae8-1cee-4790-8fa4-37a7a613ed97" (UID: "8f6a3ae8-1cee-4790-8fa4-37a7a613ed97"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.943100 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-config" (OuterVolumeSpecName: "config") pod "8f6a3ae8-1cee-4790-8fa4-37a7a613ed97" (UID: "8f6a3ae8-1cee-4790-8fa4-37a7a613ed97"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.945287 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8f6a3ae8-1cee-4790-8fa4-37a7a613ed97" (UID: "8f6a3ae8-1cee-4790-8fa4-37a7a613ed97"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:31:32 crc kubenswrapper[4817]: I0320 12:31:32.946548 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-kube-api-access-dvxt2" (OuterVolumeSpecName: "kube-api-access-dvxt2") pod "8f6a3ae8-1cee-4790-8fa4-37a7a613ed97" (UID: "8f6a3ae8-1cee-4790-8fa4-37a7a613ed97"). InnerVolumeSpecName "kube-api-access-dvxt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.044829 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvxt2\" (UniqueName: \"kubernetes.io/projected/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-kube-api-access-dvxt2\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.044867 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.044877 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.044886 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.191965 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d7d994-pddlv"] Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.199416 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d7d994-pddlv"] Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 
12:31:33.851079 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.851072 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7" event={"ID":"8f6a3ae8-1cee-4790-8fa4-37a7a613ed97","Type":"ContainerDied","Data":"587bd707e75526960ae5ed29817a23ab9805e3b89fc29ebbe3a8316fa4d2e804"} Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.851328 4817 scope.go:117] "RemoveContainer" containerID="d21617e509c20a59e0330791ce1d106edfa21ed325153fdc8fbf6fea1501b338" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.900201 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7"] Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.906336 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659bc48566-4mkc7"] Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.924323 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9"] Mar 20 12:31:33 crc kubenswrapper[4817]: E0320 12:31:33.924717 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.924745 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 12:31:33 crc kubenswrapper[4817]: E0320 12:31:33.924772 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f22ca62-a967-4dba-a7bd-27d16d4f2214" containerName="controller-manager" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.924789 4817 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2f22ca62-a967-4dba-a7bd-27d16d4f2214" containerName="controller-manager" Mar 20 12:31:33 crc kubenswrapper[4817]: E0320 12:31:33.924821 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f6a3ae8-1cee-4790-8fa4-37a7a613ed97" containerName="route-controller-manager" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.924836 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f6a3ae8-1cee-4790-8fa4-37a7a613ed97" containerName="route-controller-manager" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.925003 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.925021 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f22ca62-a967-4dba-a7bd-27d16d4f2214" containerName="controller-manager" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.925048 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f6a3ae8-1cee-4790-8fa4-37a7a613ed97" containerName="route-controller-manager" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.925724 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.929071 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.929829 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.930235 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.930784 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.933063 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2"] Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.935056 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.936076 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.938004 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.944234 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.944424 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.944559 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.944704 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.944840 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.945013 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.945211 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.954226 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9"] Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.960064 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50160eae-356b-49b7-ab43-92c6b72cd05d-config\") pod \"route-controller-manager-9dccc7df6-nsrj2\" (UID: \"50160eae-356b-49b7-ab43-92c6b72cd05d\") " 
pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.960139 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7rrw\" (UniqueName: \"kubernetes.io/projected/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-kube-api-access-z7rrw\") pod \"controller-manager-6ffdfbfd5b-s6rn9\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.960161 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmwqs\" (UniqueName: \"kubernetes.io/projected/50160eae-356b-49b7-ab43-92c6b72cd05d-kube-api-access-mmwqs\") pod \"route-controller-manager-9dccc7df6-nsrj2\" (UID: \"50160eae-356b-49b7-ab43-92c6b72cd05d\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.960179 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50160eae-356b-49b7-ab43-92c6b72cd05d-client-ca\") pod \"route-controller-manager-9dccc7df6-nsrj2\" (UID: \"50160eae-356b-49b7-ab43-92c6b72cd05d\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.960199 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-config\") pod \"controller-manager-6ffdfbfd5b-s6rn9\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.960215 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-client-ca\") pod \"controller-manager-6ffdfbfd5b-s6rn9\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.960233 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-proxy-ca-bundles\") pod \"controller-manager-6ffdfbfd5b-s6rn9\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.960249 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50160eae-356b-49b7-ab43-92c6b72cd05d-serving-cert\") pod \"route-controller-manager-9dccc7df6-nsrj2\" (UID: \"50160eae-356b-49b7-ab43-92c6b72cd05d\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.960321 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-serving-cert\") pod \"controller-manager-6ffdfbfd5b-s6rn9\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:33 crc kubenswrapper[4817]: I0320 12:31:33.960407 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2"] Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.061738 4817 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-serving-cert\") pod \"controller-manager-6ffdfbfd5b-s6rn9\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.061804 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50160eae-356b-49b7-ab43-92c6b72cd05d-config\") pod \"route-controller-manager-9dccc7df6-nsrj2\" (UID: \"50160eae-356b-49b7-ab43-92c6b72cd05d\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.061855 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7rrw\" (UniqueName: \"kubernetes.io/projected/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-kube-api-access-z7rrw\") pod \"controller-manager-6ffdfbfd5b-s6rn9\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.062297 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmwqs\" (UniqueName: \"kubernetes.io/projected/50160eae-356b-49b7-ab43-92c6b72cd05d-kube-api-access-mmwqs\") pod \"route-controller-manager-9dccc7df6-nsrj2\" (UID: \"50160eae-356b-49b7-ab43-92c6b72cd05d\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.062327 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50160eae-356b-49b7-ab43-92c6b72cd05d-client-ca\") pod \"route-controller-manager-9dccc7df6-nsrj2\" (UID: \"50160eae-356b-49b7-ab43-92c6b72cd05d\") " 
pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.062630 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-config\") pod \"controller-manager-6ffdfbfd5b-s6rn9\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.062660 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-client-ca\") pod \"controller-manager-6ffdfbfd5b-s6rn9\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.063766 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-proxy-ca-bundles\") pod \"controller-manager-6ffdfbfd5b-s6rn9\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.063798 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50160eae-356b-49b7-ab43-92c6b72cd05d-serving-cert\") pod \"route-controller-manager-9dccc7df6-nsrj2\" (UID: \"50160eae-356b-49b7-ab43-92c6b72cd05d\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.064570 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50160eae-356b-49b7-ab43-92c6b72cd05d-client-ca\") pod 
\"route-controller-manager-9dccc7df6-nsrj2\" (UID: \"50160eae-356b-49b7-ab43-92c6b72cd05d\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.065390 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-client-ca\") pod \"controller-manager-6ffdfbfd5b-s6rn9\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.065466 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-config\") pod \"controller-manager-6ffdfbfd5b-s6rn9\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.065861 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-proxy-ca-bundles\") pod \"controller-manager-6ffdfbfd5b-s6rn9\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.066251 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50160eae-356b-49b7-ab43-92c6b72cd05d-config\") pod \"route-controller-manager-9dccc7df6-nsrj2\" (UID: \"50160eae-356b-49b7-ab43-92c6b72cd05d\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.066505 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-serving-cert\") pod \"controller-manager-6ffdfbfd5b-s6rn9\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.069243 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50160eae-356b-49b7-ab43-92c6b72cd05d-serving-cert\") pod \"route-controller-manager-9dccc7df6-nsrj2\" (UID: \"50160eae-356b-49b7-ab43-92c6b72cd05d\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.080363 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmwqs\" (UniqueName: \"kubernetes.io/projected/50160eae-356b-49b7-ab43-92c6b72cd05d-kube-api-access-mmwqs\") pod \"route-controller-manager-9dccc7df6-nsrj2\" (UID: \"50160eae-356b-49b7-ab43-92c6b72cd05d\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.081452 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7rrw\" (UniqueName: \"kubernetes.io/projected/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-kube-api-access-z7rrw\") pod \"controller-manager-6ffdfbfd5b-s6rn9\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.297830 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.306404 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.584336 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2"] Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.670891 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f22ca62-a967-4dba-a7bd-27d16d4f2214" path="/var/lib/kubelet/pods/2f22ca62-a967-4dba-a7bd-27d16d4f2214/volumes" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.671844 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f6a3ae8-1cee-4790-8fa4-37a7a613ed97" path="/var/lib/kubelet/pods/8f6a3ae8-1cee-4790-8fa4-37a7a613ed97/volumes" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.735288 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9"] Mar 20 12:31:34 crc kubenswrapper[4817]: W0320 12:31:34.740838 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9d278f3_64a9_4c83_8deb_6cbaa53d16d0.slice/crio-e817c80ba3ea341482c9ce7e7acebe8d651f2ba1427b5cc8b775f5d53f4266c0 WatchSource:0}: Error finding container e817c80ba3ea341482c9ce7e7acebe8d651f2ba1427b5cc8b775f5d53f4266c0: Status 404 returned error can't find the container with id e817c80ba3ea341482c9ce7e7acebe8d651f2ba1427b5cc8b775f5d53f4266c0 Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.863267 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" event={"ID":"50160eae-356b-49b7-ab43-92c6b72cd05d","Type":"ContainerStarted","Data":"d4f9b96f23308d5d1c5956f4ef7179640fdcf67ed5886dd36976e03f7fcf44f6"} Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.864078 4817 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.864160 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" event={"ID":"50160eae-356b-49b7-ab43-92c6b72cd05d","Type":"ContainerStarted","Data":"67b70498279072f4f1ccdb0a33c60a3343b3bca69e315c9bde41493b0731fec4"} Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.864305 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" event={"ID":"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0","Type":"ContainerStarted","Data":"e817c80ba3ea341482c9ce7e7acebe8d651f2ba1427b5cc8b775f5d53f4266c0"} Mar 20 12:31:34 crc kubenswrapper[4817]: I0320 12:31:34.880861 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" podStartSLOduration=2.880847286 podStartE2EDuration="2.880847286s" podCreationTimestamp="2026-03-20 12:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:31:34.880678981 +0000 UTC m=+256.968991784" watchObservedRunningTime="2026-03-20 12:31:34.880847286 +0000 UTC m=+256.969160059" Mar 20 12:31:35 crc kubenswrapper[4817]: I0320 12:31:35.073452 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:35 crc kubenswrapper[4817]: I0320 12:31:35.870880 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" event={"ID":"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0","Type":"ContainerStarted","Data":"218cb9373c117a1e728dffb3562c78f69ff19725fdaf94a80650197ff0497a69"} Mar 20 12:31:35 crc 
kubenswrapper[4817]: I0320 12:31:35.871290 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:35 crc kubenswrapper[4817]: I0320 12:31:35.875890 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:35 crc kubenswrapper[4817]: I0320 12:31:35.910456 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" podStartSLOduration=3.910433393 podStartE2EDuration="3.910433393s" podCreationTimestamp="2026-03-20 12:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:31:35.889549577 +0000 UTC m=+257.977862360" watchObservedRunningTime="2026-03-20 12:31:35.910433393 +0000 UTC m=+257.998746176" Mar 20 12:31:52 crc kubenswrapper[4817]: I0320 12:31:52.565766 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9"] Mar 20 12:31:52 crc kubenswrapper[4817]: I0320 12:31:52.566757 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" podUID="a9d278f3-64a9-4c83-8deb-6cbaa53d16d0" containerName="controller-manager" containerID="cri-o://218cb9373c117a1e728dffb3562c78f69ff19725fdaf94a80650197ff0497a69" gracePeriod=30 Mar 20 12:31:52 crc kubenswrapper[4817]: I0320 12:31:52.594718 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2"] Mar 20 12:31:52 crc kubenswrapper[4817]: I0320 12:31:52.595410 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" 
podUID="50160eae-356b-49b7-ab43-92c6b72cd05d" containerName="route-controller-manager" containerID="cri-o://d4f9b96f23308d5d1c5956f4ef7179640fdcf67ed5886dd36976e03f7fcf44f6" gracePeriod=30 Mar 20 12:31:52 crc kubenswrapper[4817]: I0320 12:31:52.978021 4817 generic.go:334] "Generic (PLEG): container finished" podID="50160eae-356b-49b7-ab43-92c6b72cd05d" containerID="d4f9b96f23308d5d1c5956f4ef7179640fdcf67ed5886dd36976e03f7fcf44f6" exitCode=0 Mar 20 12:31:52 crc kubenswrapper[4817]: I0320 12:31:52.978240 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" event={"ID":"50160eae-356b-49b7-ab43-92c6b72cd05d","Type":"ContainerDied","Data":"d4f9b96f23308d5d1c5956f4ef7179640fdcf67ed5886dd36976e03f7fcf44f6"} Mar 20 12:31:52 crc kubenswrapper[4817]: I0320 12:31:52.981302 4817 generic.go:334] "Generic (PLEG): container finished" podID="a9d278f3-64a9-4c83-8deb-6cbaa53d16d0" containerID="218cb9373c117a1e728dffb3562c78f69ff19725fdaf94a80650197ff0497a69" exitCode=0 Mar 20 12:31:52 crc kubenswrapper[4817]: I0320 12:31:52.981362 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" event={"ID":"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0","Type":"ContainerDied","Data":"218cb9373c117a1e728dffb3562c78f69ff19725fdaf94a80650197ff0497a69"} Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.666677 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.698355 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx"] Mar 20 12:31:53 crc kubenswrapper[4817]: E0320 12:31:53.699000 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50160eae-356b-49b7-ab43-92c6b72cd05d" containerName="route-controller-manager" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.699061 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="50160eae-356b-49b7-ab43-92c6b72cd05d" containerName="route-controller-manager" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.699334 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="50160eae-356b-49b7-ab43-92c6b72cd05d" containerName="route-controller-manager" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.700033 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.708318 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx"] Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.760771 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50160eae-356b-49b7-ab43-92c6b72cd05d-client-ca\") pod \"50160eae-356b-49b7-ab43-92c6b72cd05d\" (UID: \"50160eae-356b-49b7-ab43-92c6b72cd05d\") " Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.760857 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmwqs\" (UniqueName: \"kubernetes.io/projected/50160eae-356b-49b7-ab43-92c6b72cd05d-kube-api-access-mmwqs\") pod \"50160eae-356b-49b7-ab43-92c6b72cd05d\" (UID: \"50160eae-356b-49b7-ab43-92c6b72cd05d\") " Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.760903 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50160eae-356b-49b7-ab43-92c6b72cd05d-serving-cert\") pod \"50160eae-356b-49b7-ab43-92c6b72cd05d\" (UID: \"50160eae-356b-49b7-ab43-92c6b72cd05d\") " Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.760992 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50160eae-356b-49b7-ab43-92c6b72cd05d-config\") pod \"50160eae-356b-49b7-ab43-92c6b72cd05d\" (UID: \"50160eae-356b-49b7-ab43-92c6b72cd05d\") " Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.761162 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp6c4\" (UniqueName: \"kubernetes.io/projected/32b651e8-7776-49fd-8dfd-840583cf6b67-kube-api-access-bp6c4\") pod 
\"route-controller-manager-9db9fd7fb-6r5mx\" (UID: \"32b651e8-7776-49fd-8dfd-840583cf6b67\") " pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.761195 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32b651e8-7776-49fd-8dfd-840583cf6b67-config\") pod \"route-controller-manager-9db9fd7fb-6r5mx\" (UID: \"32b651e8-7776-49fd-8dfd-840583cf6b67\") " pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.761261 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b651e8-7776-49fd-8dfd-840583cf6b67-serving-cert\") pod \"route-controller-manager-9db9fd7fb-6r5mx\" (UID: \"32b651e8-7776-49fd-8dfd-840583cf6b67\") " pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.761284 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32b651e8-7776-49fd-8dfd-840583cf6b67-client-ca\") pod \"route-controller-manager-9db9fd7fb-6r5mx\" (UID: \"32b651e8-7776-49fd-8dfd-840583cf6b67\") " pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.762264 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50160eae-356b-49b7-ab43-92c6b72cd05d-client-ca" (OuterVolumeSpecName: "client-ca") pod "50160eae-356b-49b7-ab43-92c6b72cd05d" (UID: "50160eae-356b-49b7-ab43-92c6b72cd05d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.762344 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50160eae-356b-49b7-ab43-92c6b72cd05d-config" (OuterVolumeSpecName: "config") pod "50160eae-356b-49b7-ab43-92c6b72cd05d" (UID: "50160eae-356b-49b7-ab43-92c6b72cd05d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.765912 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50160eae-356b-49b7-ab43-92c6b72cd05d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "50160eae-356b-49b7-ab43-92c6b72cd05d" (UID: "50160eae-356b-49b7-ab43-92c6b72cd05d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.766181 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50160eae-356b-49b7-ab43-92c6b72cd05d-kube-api-access-mmwqs" (OuterVolumeSpecName: "kube-api-access-mmwqs") pod "50160eae-356b-49b7-ab43-92c6b72cd05d" (UID: "50160eae-356b-49b7-ab43-92c6b72cd05d"). InnerVolumeSpecName "kube-api-access-mmwqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.791788 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.862321 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-serving-cert\") pod \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.862419 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-client-ca\") pod \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.862663 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-config\") pod \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.864032 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-client-ca" (OuterVolumeSpecName: "client-ca") pod "a9d278f3-64a9-4c83-8deb-6cbaa53d16d0" (UID: "a9d278f3-64a9-4c83-8deb-6cbaa53d16d0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.864091 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-config" (OuterVolumeSpecName: "config") pod "a9d278f3-64a9-4c83-8deb-6cbaa53d16d0" (UID: "a9d278f3-64a9-4c83-8deb-6cbaa53d16d0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.866085 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-proxy-ca-bundles\") pod \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.866332 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7rrw\" (UniqueName: \"kubernetes.io/projected/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-kube-api-access-z7rrw\") pod \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\" (UID: \"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0\") " Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.866558 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b651e8-7776-49fd-8dfd-840583cf6b67-serving-cert\") pod \"route-controller-manager-9db9fd7fb-6r5mx\" (UID: \"32b651e8-7776-49fd-8dfd-840583cf6b67\") " pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.866604 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32b651e8-7776-49fd-8dfd-840583cf6b67-client-ca\") pod \"route-controller-manager-9db9fd7fb-6r5mx\" (UID: \"32b651e8-7776-49fd-8dfd-840583cf6b67\") " pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.866699 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp6c4\" (UniqueName: \"kubernetes.io/projected/32b651e8-7776-49fd-8dfd-840583cf6b67-kube-api-access-bp6c4\") pod \"route-controller-manager-9db9fd7fb-6r5mx\" (UID: 
\"32b651e8-7776-49fd-8dfd-840583cf6b67\") " pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.866735 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32b651e8-7776-49fd-8dfd-840583cf6b67-config\") pod \"route-controller-manager-9db9fd7fb-6r5mx\" (UID: \"32b651e8-7776-49fd-8dfd-840583cf6b67\") " pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.866861 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50160eae-356b-49b7-ab43-92c6b72cd05d-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.866882 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50160eae-356b-49b7-ab43-92c6b72cd05d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.866896 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.866909 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmwqs\" (UniqueName: \"kubernetes.io/projected/50160eae-356b-49b7-ab43-92c6b72cd05d-kube-api-access-mmwqs\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.866922 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.866933 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/50160eae-356b-49b7-ab43-92c6b72cd05d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.868687 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32b651e8-7776-49fd-8dfd-840583cf6b67-client-ca\") pod \"route-controller-manager-9db9fd7fb-6r5mx\" (UID: \"32b651e8-7776-49fd-8dfd-840583cf6b67\") " pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.870630 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a9d278f3-64a9-4c83-8deb-6cbaa53d16d0" (UID: "a9d278f3-64a9-4c83-8deb-6cbaa53d16d0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.871261 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a9d278f3-64a9-4c83-8deb-6cbaa53d16d0" (UID: "a9d278f3-64a9-4c83-8deb-6cbaa53d16d0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.872082 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b651e8-7776-49fd-8dfd-840583cf6b67-serving-cert\") pod \"route-controller-manager-9db9fd7fb-6r5mx\" (UID: \"32b651e8-7776-49fd-8dfd-840583cf6b67\") " pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.875526 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32b651e8-7776-49fd-8dfd-840583cf6b67-config\") pod \"route-controller-manager-9db9fd7fb-6r5mx\" (UID: \"32b651e8-7776-49fd-8dfd-840583cf6b67\") " pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.877240 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-kube-api-access-z7rrw" (OuterVolumeSpecName: "kube-api-access-z7rrw") pod "a9d278f3-64a9-4c83-8deb-6cbaa53d16d0" (UID: "a9d278f3-64a9-4c83-8deb-6cbaa53d16d0"). InnerVolumeSpecName "kube-api-access-z7rrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.884691 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp6c4\" (UniqueName: \"kubernetes.io/projected/32b651e8-7776-49fd-8dfd-840583cf6b67-kube-api-access-bp6c4\") pod \"route-controller-manager-9db9fd7fb-6r5mx\" (UID: \"32b651e8-7776-49fd-8dfd-840583cf6b67\") " pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.970157 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.970202 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.970219 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7rrw\" (UniqueName: \"kubernetes.io/projected/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0-kube-api-access-z7rrw\") on node \"crc\" DevicePath \"\"" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.990733 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" event={"ID":"a9d278f3-64a9-4c83-8deb-6cbaa53d16d0","Type":"ContainerDied","Data":"e817c80ba3ea341482c9ce7e7acebe8d651f2ba1427b5cc8b775f5d53f4266c0"} Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.990792 4817 scope.go:117] "RemoveContainer" containerID="218cb9373c117a1e728dffb3562c78f69ff19725fdaf94a80650197ff0497a69" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.990789 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9" Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.995186 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" event={"ID":"50160eae-356b-49b7-ab43-92c6b72cd05d","Type":"ContainerDied","Data":"67b70498279072f4f1ccdb0a33c60a3343b3bca69e315c9bde41493b0731fec4"} Mar 20 12:31:53 crc kubenswrapper[4817]: I0320 12:31:53.995216 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2" Mar 20 12:31:54 crc kubenswrapper[4817]: I0320 12:31:54.019806 4817 scope.go:117] "RemoveContainer" containerID="d4f9b96f23308d5d1c5956f4ef7179640fdcf67ed5886dd36976e03f7fcf44f6" Mar 20 12:31:54 crc kubenswrapper[4817]: I0320 12:31:54.023873 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:31:54 crc kubenswrapper[4817]: I0320 12:31:54.037593 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2"] Mar 20 12:31:54 crc kubenswrapper[4817]: I0320 12:31:54.051390 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9dccc7df6-nsrj2"] Mar 20 12:31:54 crc kubenswrapper[4817]: I0320 12:31:54.061516 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9"] Mar 20 12:31:54 crc kubenswrapper[4817]: I0320 12:31:54.070589 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6ffdfbfd5b-s6rn9"] Mar 20 12:31:54 crc kubenswrapper[4817]: I0320 12:31:54.672213 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="50160eae-356b-49b7-ab43-92c6b72cd05d" path="/var/lib/kubelet/pods/50160eae-356b-49b7-ab43-92c6b72cd05d/volumes" Mar 20 12:31:54 crc kubenswrapper[4817]: I0320 12:31:54.673222 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9d278f3-64a9-4c83-8deb-6cbaa53d16d0" path="/var/lib/kubelet/pods/a9d278f3-64a9-4c83-8deb-6cbaa53d16d0/volumes" Mar 20 12:31:54 crc kubenswrapper[4817]: I0320 12:31:54.754335 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx"] Mar 20 12:31:54 crc kubenswrapper[4817]: W0320 12:31:54.764922 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32b651e8_7776_49fd_8dfd_840583cf6b67.slice/crio-9e13cf13308d3794227b232a78232ce173a3ec8e08a6ab6281ffdf6471203b99 WatchSource:0}: Error finding container 9e13cf13308d3794227b232a78232ce173a3ec8e08a6ab6281ffdf6471203b99: Status 404 returned error can't find the container with id 9e13cf13308d3794227b232a78232ce173a3ec8e08a6ab6281ffdf6471203b99 Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.013039 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" event={"ID":"32b651e8-7776-49fd-8dfd-840583cf6b67","Type":"ContainerStarted","Data":"eb909ae1407aa170148434ac6fc44df47394f26419f744566d396f9ba97e6432"} Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.013462 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" event={"ID":"32b651e8-7776-49fd-8dfd-840583cf6b67","Type":"ContainerStarted","Data":"9e13cf13308d3794227b232a78232ce173a3ec8e08a6ab6281ffdf6471203b99"} Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.013485 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.034519 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" podStartSLOduration=3.034500651 podStartE2EDuration="3.034500651s" podCreationTimestamp="2026-03-20 12:31:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:31:55.032490133 +0000 UTC m=+277.120802956" watchObservedRunningTime="2026-03-20 12:31:55.034500651 +0000 UTC m=+277.122813434" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.330022 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.499785 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.574850 4817 patch_prober.go:28] interesting pod/machine-config-daemon-dch6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.575181 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dch6v" podUID="c8b7e138-8c64-47fb-84b7-4a42e612947d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.575325 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-dch6v" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.575949 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf0956e93ab88c31a4db3e9b805bb42b61e80dc5e4176715cb441ce0c2ff0420"} pod="openshift-machine-config-operator/machine-config-daemon-dch6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.576105 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dch6v" podUID="c8b7e138-8c64-47fb-84b7-4a42e612947d" containerName="machine-config-daemon" containerID="cri-o://bf0956e93ab88c31a4db3e9b805bb42b61e80dc5e4176715cb441ce0c2ff0420" gracePeriod=600 Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.929978 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz"] Mar 20 12:31:55 crc kubenswrapper[4817]: E0320 12:31:55.930238 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d278f3-64a9-4c83-8deb-6cbaa53d16d0" containerName="controller-manager" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.930256 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d278f3-64a9-4c83-8deb-6cbaa53d16d0" containerName="controller-manager" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.930377 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9d278f3-64a9-4c83-8deb-6cbaa53d16d0" containerName="controller-manager" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.930733 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.932774 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.936223 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.936584 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.936742 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.937529 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.940846 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz"] Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.940872 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.944763 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.998824 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-config\") pod \"controller-manager-5d8d8f6646-fdtmz\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " 
pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.999157 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-client-ca\") pod \"controller-manager-5d8d8f6646-fdtmz\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.999250 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/785a671e-d76b-4644-add0-cc678616732a-serving-cert\") pod \"controller-manager-5d8d8f6646-fdtmz\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.999440 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md9s5\" (UniqueName: \"kubernetes.io/projected/785a671e-d76b-4644-add0-cc678616732a-kube-api-access-md9s5\") pod \"controller-manager-5d8d8f6646-fdtmz\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:55 crc kubenswrapper[4817]: I0320 12:31:55.999496 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-proxy-ca-bundles\") pod \"controller-manager-5d8d8f6646-fdtmz\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:56 crc kubenswrapper[4817]: I0320 12:31:56.024232 4817 generic.go:334] "Generic (PLEG): container finished" 
podID="c8b7e138-8c64-47fb-84b7-4a42e612947d" containerID="bf0956e93ab88c31a4db3e9b805bb42b61e80dc5e4176715cb441ce0c2ff0420" exitCode=0 Mar 20 12:31:56 crc kubenswrapper[4817]: I0320 12:31:56.024346 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dch6v" event={"ID":"c8b7e138-8c64-47fb-84b7-4a42e612947d","Type":"ContainerDied","Data":"bf0956e93ab88c31a4db3e9b805bb42b61e80dc5e4176715cb441ce0c2ff0420"} Mar 20 12:31:56 crc kubenswrapper[4817]: I0320 12:31:56.024416 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dch6v" event={"ID":"c8b7e138-8c64-47fb-84b7-4a42e612947d","Type":"ContainerStarted","Data":"bf1ae3dbdd47367c661bf7c7e6a843dda21fb19d5950ad9e49f9dd2202c0159b"} Mar 20 12:31:56 crc kubenswrapper[4817]: I0320 12:31:56.101825 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-client-ca\") pod \"controller-manager-5d8d8f6646-fdtmz\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:56 crc kubenswrapper[4817]: I0320 12:31:56.102072 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/785a671e-d76b-4644-add0-cc678616732a-serving-cert\") pod \"controller-manager-5d8d8f6646-fdtmz\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:56 crc kubenswrapper[4817]: I0320 12:31:56.102257 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md9s5\" (UniqueName: \"kubernetes.io/projected/785a671e-d76b-4644-add0-cc678616732a-kube-api-access-md9s5\") pod \"controller-manager-5d8d8f6646-fdtmz\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " 
pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:56 crc kubenswrapper[4817]: I0320 12:31:56.102341 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-proxy-ca-bundles\") pod \"controller-manager-5d8d8f6646-fdtmz\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:56 crc kubenswrapper[4817]: I0320 12:31:56.103188 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-client-ca\") pod \"controller-manager-5d8d8f6646-fdtmz\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:56 crc kubenswrapper[4817]: I0320 12:31:56.104607 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-proxy-ca-bundles\") pod \"controller-manager-5d8d8f6646-fdtmz\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:56 crc kubenswrapper[4817]: I0320 12:31:56.104807 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-config\") pod \"controller-manager-5d8d8f6646-fdtmz\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:56 crc kubenswrapper[4817]: I0320 12:31:56.107697 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-config\") pod 
\"controller-manager-5d8d8f6646-fdtmz\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:56 crc kubenswrapper[4817]: I0320 12:31:56.109744 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/785a671e-d76b-4644-add0-cc678616732a-serving-cert\") pod \"controller-manager-5d8d8f6646-fdtmz\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:56 crc kubenswrapper[4817]: I0320 12:31:56.119002 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md9s5\" (UniqueName: \"kubernetes.io/projected/785a671e-d76b-4644-add0-cc678616732a-kube-api-access-md9s5\") pod \"controller-manager-5d8d8f6646-fdtmz\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:56 crc kubenswrapper[4817]: I0320 12:31:56.245613 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:56 crc kubenswrapper[4817]: I0320 12:31:56.705917 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz"] Mar 20 12:31:56 crc kubenswrapper[4817]: W0320 12:31:56.723175 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod785a671e_d76b_4644_add0_cc678616732a.slice/crio-b950ac3cb4f7cb73c3f312f8f5eb0de563ca1e21ddf4f7b91ce524d439b245ad WatchSource:0}: Error finding container b950ac3cb4f7cb73c3f312f8f5eb0de563ca1e21ddf4f7b91ce524d439b245ad: Status 404 returned error can't find the container with id b950ac3cb4f7cb73c3f312f8f5eb0de563ca1e21ddf4f7b91ce524d439b245ad Mar 20 12:31:57 crc kubenswrapper[4817]: I0320 12:31:57.033119 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" event={"ID":"785a671e-d76b-4644-add0-cc678616732a","Type":"ContainerStarted","Data":"88f031e5e341c5a22df1c6a673f1c17af2a0926c427bff297900ba7eda15b49f"} Mar 20 12:31:57 crc kubenswrapper[4817]: I0320 12:31:57.033439 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" event={"ID":"785a671e-d76b-4644-add0-cc678616732a","Type":"ContainerStarted","Data":"b950ac3cb4f7cb73c3f312f8f5eb0de563ca1e21ddf4f7b91ce524d439b245ad"} Mar 20 12:31:57 crc kubenswrapper[4817]: I0320 12:31:57.034919 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:57 crc kubenswrapper[4817]: I0320 12:31:57.048163 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:31:57 crc kubenswrapper[4817]: I0320 12:31:57.066784 4817 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" podStartSLOduration=5.066759729 podStartE2EDuration="5.066759729s" podCreationTimestamp="2026-03-20 12:31:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:31:57.0644644 +0000 UTC m=+279.152777183" watchObservedRunningTime="2026-03-20 12:31:57.066759729 +0000 UTC m=+279.155072512" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.639996 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-n7sgd"] Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.641050 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.654206 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-n7sgd"] Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.753524 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/36db16cc-fbe1-477a-83e7-314307888af9-registry-certificates\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.753588 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36db16cc-fbe1-477a-83e7-314307888af9-trusted-ca\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 
12:31:59.753617 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36db16cc-fbe1-477a-83e7-314307888af9-bound-sa-token\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.753647 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.753666 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv9jc\" (UniqueName: \"kubernetes.io/projected/36db16cc-fbe1-477a-83e7-314307888af9-kube-api-access-hv9jc\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.753715 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/36db16cc-fbe1-477a-83e7-314307888af9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.753778 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/36db16cc-fbe1-477a-83e7-314307888af9-registry-tls\") 
pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.753821 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/36db16cc-fbe1-477a-83e7-314307888af9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.780768 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.855569 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36db16cc-fbe1-477a-83e7-314307888af9-trusted-ca\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.855900 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36db16cc-fbe1-477a-83e7-314307888af9-bound-sa-token\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.855941 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hv9jc\" (UniqueName: \"kubernetes.io/projected/36db16cc-fbe1-477a-83e7-314307888af9-kube-api-access-hv9jc\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.855970 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/36db16cc-fbe1-477a-83e7-314307888af9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.856028 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/36db16cc-fbe1-477a-83e7-314307888af9-registry-tls\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.856046 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/36db16cc-fbe1-477a-83e7-314307888af9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.856074 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/36db16cc-fbe1-477a-83e7-314307888af9-registry-certificates\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: 
I0320 12:31:59.856505 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/36db16cc-fbe1-477a-83e7-314307888af9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.857221 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36db16cc-fbe1-477a-83e7-314307888af9-trusted-ca\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.857371 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/36db16cc-fbe1-477a-83e7-314307888af9-registry-certificates\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.862589 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/36db16cc-fbe1-477a-83e7-314307888af9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.865954 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/36db16cc-fbe1-477a-83e7-314307888af9-registry-tls\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" 
Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.871776 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36db16cc-fbe1-477a-83e7-314307888af9-bound-sa-token\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.877540 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv9jc\" (UniqueName: \"kubernetes.io/projected/36db16cc-fbe1-477a-83e7-314307888af9-kube-api-access-hv9jc\") pod \"image-registry-66df7c8f76-n7sgd\" (UID: \"36db16cc-fbe1-477a-83e7-314307888af9\") " pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:31:59 crc kubenswrapper[4817]: I0320 12:31:59.955205 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:32:00 crc kubenswrapper[4817]: I0320 12:32:00.166646 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566832-5fk4d"] Mar 20 12:32:00 crc kubenswrapper[4817]: I0320 12:32:00.168064 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566832-5fk4d" Mar 20 12:32:00 crc kubenswrapper[4817]: I0320 12:32:00.171567 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 12:32:00 crc kubenswrapper[4817]: I0320 12:32:00.171594 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lqzqd" Mar 20 12:32:00 crc kubenswrapper[4817]: I0320 12:32:00.173059 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 12:32:00 crc kubenswrapper[4817]: I0320 12:32:00.174840 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566832-5fk4d"] Mar 20 12:32:00 crc kubenswrapper[4817]: I0320 12:32:00.261303 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cpbj\" (UniqueName: \"kubernetes.io/projected/1f0aed4e-9067-4392-86d2-97e840863422-kube-api-access-8cpbj\") pod \"auto-csr-approver-29566832-5fk4d\" (UID: \"1f0aed4e-9067-4392-86d2-97e840863422\") " pod="openshift-infra/auto-csr-approver-29566832-5fk4d" Mar 20 12:32:00 crc kubenswrapper[4817]: I0320 12:32:00.364460 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cpbj\" (UniqueName: \"kubernetes.io/projected/1f0aed4e-9067-4392-86d2-97e840863422-kube-api-access-8cpbj\") pod \"auto-csr-approver-29566832-5fk4d\" (UID: \"1f0aed4e-9067-4392-86d2-97e840863422\") " pod="openshift-infra/auto-csr-approver-29566832-5fk4d" Mar 20 12:32:00 crc kubenswrapper[4817]: I0320 12:32:00.391817 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cpbj\" (UniqueName: \"kubernetes.io/projected/1f0aed4e-9067-4392-86d2-97e840863422-kube-api-access-8cpbj\") pod \"auto-csr-approver-29566832-5fk4d\" (UID: \"1f0aed4e-9067-4392-86d2-97e840863422\") " 
pod="openshift-infra/auto-csr-approver-29566832-5fk4d" Mar 20 12:32:00 crc kubenswrapper[4817]: I0320 12:32:00.428347 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-n7sgd"] Mar 20 12:32:00 crc kubenswrapper[4817]: W0320 12:32:00.430589 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36db16cc_fbe1_477a_83e7_314307888af9.slice/crio-9f1efee99cdd68872b42d97ef6b0ef623c273f11e7dd45d86b637a20900fa1cf WatchSource:0}: Error finding container 9f1efee99cdd68872b42d97ef6b0ef623c273f11e7dd45d86b637a20900fa1cf: Status 404 returned error can't find the container with id 9f1efee99cdd68872b42d97ef6b0ef623c273f11e7dd45d86b637a20900fa1cf Mar 20 12:32:00 crc kubenswrapper[4817]: I0320 12:32:00.487588 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566832-5fk4d" Mar 20 12:32:00 crc kubenswrapper[4817]: I0320 12:32:00.894601 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566832-5fk4d"] Mar 20 12:32:01 crc kubenswrapper[4817]: I0320 12:32:01.053752 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" event={"ID":"36db16cc-fbe1-477a-83e7-314307888af9","Type":"ContainerStarted","Data":"110c69db3164d9c725f6292207e2065c5b2ebbb220a5374622ce98385338b846"} Mar 20 12:32:01 crc kubenswrapper[4817]: I0320 12:32:01.053829 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:32:01 crc kubenswrapper[4817]: I0320 12:32:01.053845 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" 
event={"ID":"36db16cc-fbe1-477a-83e7-314307888af9","Type":"ContainerStarted","Data":"9f1efee99cdd68872b42d97ef6b0ef623c273f11e7dd45d86b637a20900fa1cf"} Mar 20 12:32:01 crc kubenswrapper[4817]: I0320 12:32:01.054794 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566832-5fk4d" event={"ID":"1f0aed4e-9067-4392-86d2-97e840863422","Type":"ContainerStarted","Data":"d3d1b365b7c5c36b93a2186a8fe44314df62384f3ee94151f27fbf558edde50f"} Mar 20 12:32:01 crc kubenswrapper[4817]: I0320 12:32:01.074023 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" podStartSLOduration=2.074004797 podStartE2EDuration="2.074004797s" podCreationTimestamp="2026-03-20 12:31:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:32:01.069928868 +0000 UTC m=+283.158241671" watchObservedRunningTime="2026-03-20 12:32:01.074004797 +0000 UTC m=+283.162317600" Mar 20 12:32:03 crc kubenswrapper[4817]: I0320 12:32:03.068843 4817 generic.go:334] "Generic (PLEG): container finished" podID="1f0aed4e-9067-4392-86d2-97e840863422" containerID="5b3c82efe0d8ff750c1e9de287eced4e83b4420d707953c42b30021c810c51a3" exitCode=0 Mar 20 12:32:03 crc kubenswrapper[4817]: I0320 12:32:03.068930 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566832-5fk4d" event={"ID":"1f0aed4e-9067-4392-86d2-97e840863422","Type":"ContainerDied","Data":"5b3c82efe0d8ff750c1e9de287eced4e83b4420d707953c42b30021c810c51a3"} Mar 20 12:32:04 crc kubenswrapper[4817]: I0320 12:32:04.561728 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566832-5fk4d" Mar 20 12:32:04 crc kubenswrapper[4817]: I0320 12:32:04.653460 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cpbj\" (UniqueName: \"kubernetes.io/projected/1f0aed4e-9067-4392-86d2-97e840863422-kube-api-access-8cpbj\") pod \"1f0aed4e-9067-4392-86d2-97e840863422\" (UID: \"1f0aed4e-9067-4392-86d2-97e840863422\") " Mar 20 12:32:04 crc kubenswrapper[4817]: I0320 12:32:04.659514 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f0aed4e-9067-4392-86d2-97e840863422-kube-api-access-8cpbj" (OuterVolumeSpecName: "kube-api-access-8cpbj") pod "1f0aed4e-9067-4392-86d2-97e840863422" (UID: "1f0aed4e-9067-4392-86d2-97e840863422"). InnerVolumeSpecName "kube-api-access-8cpbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:32:04 crc kubenswrapper[4817]: I0320 12:32:04.755507 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cpbj\" (UniqueName: \"kubernetes.io/projected/1f0aed4e-9067-4392-86d2-97e840863422-kube-api-access-8cpbj\") on node \"crc\" DevicePath \"\"" Mar 20 12:32:05 crc kubenswrapper[4817]: I0320 12:32:05.085701 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566832-5fk4d" event={"ID":"1f0aed4e-9067-4392-86d2-97e840863422","Type":"ContainerDied","Data":"d3d1b365b7c5c36b93a2186a8fe44314df62384f3ee94151f27fbf558edde50f"} Mar 20 12:32:05 crc kubenswrapper[4817]: I0320 12:32:05.086041 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3d1b365b7c5c36b93a2186a8fe44314df62384f3ee94151f27fbf558edde50f" Mar 20 12:32:05 crc kubenswrapper[4817]: I0320 12:32:05.085804 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566832-5fk4d" Mar 20 12:32:12 crc kubenswrapper[4817]: I0320 12:32:12.274906 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz"] Mar 20 12:32:12 crc kubenswrapper[4817]: I0320 12:32:12.275689 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" podUID="785a671e-d76b-4644-add0-cc678616732a" containerName="controller-manager" containerID="cri-o://88f031e5e341c5a22df1c6a673f1c17af2a0926c427bff297900ba7eda15b49f" gracePeriod=30 Mar 20 12:32:12 crc kubenswrapper[4817]: I0320 12:32:12.294897 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx"] Mar 20 12:32:12 crc kubenswrapper[4817]: I0320 12:32:12.295138 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" podUID="32b651e8-7776-49fd-8dfd-840583cf6b67" containerName="route-controller-manager" containerID="cri-o://eb909ae1407aa170148434ac6fc44df47394f26419f744566d396f9ba97e6432" gracePeriod=30 Mar 20 12:32:12 crc kubenswrapper[4817]: I0320 12:32:12.973649 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:32:12 crc kubenswrapper[4817]: I0320 12:32:12.977783 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.077963 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md9s5\" (UniqueName: \"kubernetes.io/projected/785a671e-d76b-4644-add0-cc678616732a-kube-api-access-md9s5\") pod \"785a671e-d76b-4644-add0-cc678616732a\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.078027 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b651e8-7776-49fd-8dfd-840583cf6b67-serving-cert\") pod \"32b651e8-7776-49fd-8dfd-840583cf6b67\" (UID: \"32b651e8-7776-49fd-8dfd-840583cf6b67\") " Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.078793 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32b651e8-7776-49fd-8dfd-840583cf6b67-client-ca\") pod \"32b651e8-7776-49fd-8dfd-840583cf6b67\" (UID: \"32b651e8-7776-49fd-8dfd-840583cf6b67\") " Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.078842 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-config\") pod \"785a671e-d76b-4644-add0-cc678616732a\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.078986 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-client-ca\") pod \"785a671e-d76b-4644-add0-cc678616732a\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.079003 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/785a671e-d76b-4644-add0-cc678616732a-serving-cert\") pod \"785a671e-d76b-4644-add0-cc678616732a\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.079316 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-proxy-ca-bundles\") pod \"785a671e-d76b-4644-add0-cc678616732a\" (UID: \"785a671e-d76b-4644-add0-cc678616732a\") " Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.079355 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp6c4\" (UniqueName: \"kubernetes.io/projected/32b651e8-7776-49fd-8dfd-840583cf6b67-kube-api-access-bp6c4\") pod \"32b651e8-7776-49fd-8dfd-840583cf6b67\" (UID: \"32b651e8-7776-49fd-8dfd-840583cf6b67\") " Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.079341 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32b651e8-7776-49fd-8dfd-840583cf6b67-client-ca" (OuterVolumeSpecName: "client-ca") pod "32b651e8-7776-49fd-8dfd-840583cf6b67" (UID: "32b651e8-7776-49fd-8dfd-840583cf6b67"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.079413 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-client-ca" (OuterVolumeSpecName: "client-ca") pod "785a671e-d76b-4644-add0-cc678616732a" (UID: "785a671e-d76b-4644-add0-cc678616732a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.079479 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-config" (OuterVolumeSpecName: "config") pod "785a671e-d76b-4644-add0-cc678616732a" (UID: "785a671e-d76b-4644-add0-cc678616732a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.079506 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32b651e8-7776-49fd-8dfd-840583cf6b67-config\") pod \"32b651e8-7776-49fd-8dfd-840583cf6b67\" (UID: \"32b651e8-7776-49fd-8dfd-840583cf6b67\") " Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.079830 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "785a671e-d76b-4644-add0-cc678616732a" (UID: "785a671e-d76b-4644-add0-cc678616732a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.079950 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32b651e8-7776-49fd-8dfd-840583cf6b67-config" (OuterVolumeSpecName: "config") pod "32b651e8-7776-49fd-8dfd-840583cf6b67" (UID: "32b651e8-7776-49fd-8dfd-840583cf6b67"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.079991 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32b651e8-7776-49fd-8dfd-840583cf6b67-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.080032 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.080043 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.080051 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/785a671e-d76b-4644-add0-cc678616732a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.083753 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b651e8-7776-49fd-8dfd-840583cf6b67-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "32b651e8-7776-49fd-8dfd-840583cf6b67" (UID: "32b651e8-7776-49fd-8dfd-840583cf6b67"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.083775 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/785a671e-d76b-4644-add0-cc678616732a-kube-api-access-md9s5" (OuterVolumeSpecName: "kube-api-access-md9s5") pod "785a671e-d76b-4644-add0-cc678616732a" (UID: "785a671e-d76b-4644-add0-cc678616732a"). InnerVolumeSpecName "kube-api-access-md9s5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.083854 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b651e8-7776-49fd-8dfd-840583cf6b67-kube-api-access-bp6c4" (OuterVolumeSpecName: "kube-api-access-bp6c4") pod "32b651e8-7776-49fd-8dfd-840583cf6b67" (UID: "32b651e8-7776-49fd-8dfd-840583cf6b67"). InnerVolumeSpecName "kube-api-access-bp6c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.083924 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/785a671e-d76b-4644-add0-cc678616732a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "785a671e-d76b-4644-add0-cc678616732a" (UID: "785a671e-d76b-4644-add0-cc678616732a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.142871 4817 generic.go:334] "Generic (PLEG): container finished" podID="785a671e-d76b-4644-add0-cc678616732a" containerID="88f031e5e341c5a22df1c6a673f1c17af2a0926c427bff297900ba7eda15b49f" exitCode=0 Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.142917 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.143000 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" event={"ID":"785a671e-d76b-4644-add0-cc678616732a","Type":"ContainerDied","Data":"88f031e5e341c5a22df1c6a673f1c17af2a0926c427bff297900ba7eda15b49f"} Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.143055 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz" event={"ID":"785a671e-d76b-4644-add0-cc678616732a","Type":"ContainerDied","Data":"b950ac3cb4f7cb73c3f312f8f5eb0de563ca1e21ddf4f7b91ce524d439b245ad"} Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.143082 4817 scope.go:117] "RemoveContainer" containerID="88f031e5e341c5a22df1c6a673f1c17af2a0926c427bff297900ba7eda15b49f" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.144585 4817 generic.go:334] "Generic (PLEG): container finished" podID="32b651e8-7776-49fd-8dfd-840583cf6b67" containerID="eb909ae1407aa170148434ac6fc44df47394f26419f744566d396f9ba97e6432" exitCode=0 Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.144625 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.144656 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" event={"ID":"32b651e8-7776-49fd-8dfd-840583cf6b67","Type":"ContainerDied","Data":"eb909ae1407aa170148434ac6fc44df47394f26419f744566d396f9ba97e6432"} Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.144745 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx" event={"ID":"32b651e8-7776-49fd-8dfd-840583cf6b67","Type":"ContainerDied","Data":"9e13cf13308d3794227b232a78232ce173a3ec8e08a6ab6281ffdf6471203b99"} Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.160229 4817 scope.go:117] "RemoveContainer" containerID="88f031e5e341c5a22df1c6a673f1c17af2a0926c427bff297900ba7eda15b49f" Mar 20 12:32:13 crc kubenswrapper[4817]: E0320 12:32:13.160605 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88f031e5e341c5a22df1c6a673f1c17af2a0926c427bff297900ba7eda15b49f\": container with ID starting with 88f031e5e341c5a22df1c6a673f1c17af2a0926c427bff297900ba7eda15b49f not found: ID does not exist" containerID="88f031e5e341c5a22df1c6a673f1c17af2a0926c427bff297900ba7eda15b49f" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.160687 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f031e5e341c5a22df1c6a673f1c17af2a0926c427bff297900ba7eda15b49f"} err="failed to get container status \"88f031e5e341c5a22df1c6a673f1c17af2a0926c427bff297900ba7eda15b49f\": rpc error: code = NotFound desc = could not find container \"88f031e5e341c5a22df1c6a673f1c17af2a0926c427bff297900ba7eda15b49f\": container with ID starting with 
88f031e5e341c5a22df1c6a673f1c17af2a0926c427bff297900ba7eda15b49f not found: ID does not exist" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.160718 4817 scope.go:117] "RemoveContainer" containerID="eb909ae1407aa170148434ac6fc44df47394f26419f744566d396f9ba97e6432" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.177836 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx"] Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.181402 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md9s5\" (UniqueName: \"kubernetes.io/projected/785a671e-d76b-4644-add0-cc678616732a-kube-api-access-md9s5\") on node \"crc\" DevicePath \"\"" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.181425 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b651e8-7776-49fd-8dfd-840583cf6b67-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.181468 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/785a671e-d76b-4644-add0-cc678616732a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.181482 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp6c4\" (UniqueName: \"kubernetes.io/projected/32b651e8-7776-49fd-8dfd-840583cf6b67-kube-api-access-bp6c4\") on node \"crc\" DevicePath \"\"" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.181495 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32b651e8-7776-49fd-8dfd-840583cf6b67-config\") on node \"crc\" DevicePath \"\"" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.182090 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-9db9fd7fb-6r5mx"] Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.188198 4817 scope.go:117] "RemoveContainer" containerID="eb909ae1407aa170148434ac6fc44df47394f26419f744566d396f9ba97e6432" Mar 20 12:32:13 crc kubenswrapper[4817]: E0320 12:32:13.188628 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb909ae1407aa170148434ac6fc44df47394f26419f744566d396f9ba97e6432\": container with ID starting with eb909ae1407aa170148434ac6fc44df47394f26419f744566d396f9ba97e6432 not found: ID does not exist" containerID="eb909ae1407aa170148434ac6fc44df47394f26419f744566d396f9ba97e6432" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.188657 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb909ae1407aa170148434ac6fc44df47394f26419f744566d396f9ba97e6432"} err="failed to get container status \"eb909ae1407aa170148434ac6fc44df47394f26419f744566d396f9ba97e6432\": rpc error: code = NotFound desc = could not find container \"eb909ae1407aa170148434ac6fc44df47394f26419f744566d396f9ba97e6432\": container with ID starting with eb909ae1407aa170148434ac6fc44df47394f26419f744566d396f9ba97e6432 not found: ID does not exist" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.190213 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz"] Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.193265 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d8d8f6646-fdtmz"] Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.943622 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s"] Mar 20 12:32:13 crc kubenswrapper[4817]: E0320 12:32:13.944588 4817 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1f0aed4e-9067-4392-86d2-97e840863422" containerName="oc" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.944711 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0aed4e-9067-4392-86d2-97e840863422" containerName="oc" Mar 20 12:32:13 crc kubenswrapper[4817]: E0320 12:32:13.944754 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="785a671e-d76b-4644-add0-cc678616732a" containerName="controller-manager" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.944773 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="785a671e-d76b-4644-add0-cc678616732a" containerName="controller-manager" Mar 20 12:32:13 crc kubenswrapper[4817]: E0320 12:32:13.944820 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b651e8-7776-49fd-8dfd-840583cf6b67" containerName="route-controller-manager" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.944843 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b651e8-7776-49fd-8dfd-840583cf6b67" containerName="route-controller-manager" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.945163 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f0aed4e-9067-4392-86d2-97e840863422" containerName="oc" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.945200 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b651e8-7776-49fd-8dfd-840583cf6b67" containerName="route-controller-manager" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.945235 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="785a671e-d76b-4644-add0-cc678616732a" containerName="controller-manager" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.946163 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.947138 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h"] Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.947835 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.953310 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.956572 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.957274 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.957776 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.962734 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.962915 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.962735 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.962806 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" 
Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.962813 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.962829 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.963554 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.963612 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.971176 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.981319 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h"] Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.991480 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db880e58-57d3-4c3f-9d13-e6349a2f0659-config\") pod \"route-controller-manager-9dccc7df6-6nx2s\" (UID: \"db880e58-57d3-4c3f-9d13-e6349a2f0659\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.991546 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f5dad7b-574e-484d-a76a-56446e2e4e92-client-ca\") pod \"controller-manager-6ffdfbfd5b-wjd6h\" (UID: \"3f5dad7b-574e-484d-a76a-56446e2e4e92\") " 
pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.991583 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bchpz\" (UniqueName: \"kubernetes.io/projected/db880e58-57d3-4c3f-9d13-e6349a2f0659-kube-api-access-bchpz\") pod \"route-controller-manager-9dccc7df6-6nx2s\" (UID: \"db880e58-57d3-4c3f-9d13-e6349a2f0659\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.991608 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db880e58-57d3-4c3f-9d13-e6349a2f0659-serving-cert\") pod \"route-controller-manager-9dccc7df6-6nx2s\" (UID: \"db880e58-57d3-4c3f-9d13-e6349a2f0659\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.991636 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f5dad7b-574e-484d-a76a-56446e2e4e92-config\") pod \"controller-manager-6ffdfbfd5b-wjd6h\" (UID: \"3f5dad7b-574e-484d-a76a-56446e2e4e92\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.991794 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmjfw\" (UniqueName: \"kubernetes.io/projected/3f5dad7b-574e-484d-a76a-56446e2e4e92-kube-api-access-nmjfw\") pod \"controller-manager-6ffdfbfd5b-wjd6h\" (UID: \"3f5dad7b-574e-484d-a76a-56446e2e4e92\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.991924 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db880e58-57d3-4c3f-9d13-e6349a2f0659-client-ca\") pod \"route-controller-manager-9dccc7df6-6nx2s\" (UID: \"db880e58-57d3-4c3f-9d13-e6349a2f0659\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.991961 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f5dad7b-574e-484d-a76a-56446e2e4e92-serving-cert\") pod \"controller-manager-6ffdfbfd5b-wjd6h\" (UID: \"3f5dad7b-574e-484d-a76a-56446e2e4e92\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:13 crc kubenswrapper[4817]: I0320 12:32:13.992021 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f5dad7b-574e-484d-a76a-56446e2e4e92-proxy-ca-bundles\") pod \"controller-manager-6ffdfbfd5b-wjd6h\" (UID: \"3f5dad7b-574e-484d-a76a-56446e2e4e92\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.004422 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s"] Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.093316 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f5dad7b-574e-484d-a76a-56446e2e4e92-client-ca\") pod \"controller-manager-6ffdfbfd5b-wjd6h\" (UID: \"3f5dad7b-574e-484d-a76a-56446e2e4e92\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.093358 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/db880e58-57d3-4c3f-9d13-e6349a2f0659-serving-cert\") pod \"route-controller-manager-9dccc7df6-6nx2s\" (UID: \"db880e58-57d3-4c3f-9d13-e6349a2f0659\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.093380 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bchpz\" (UniqueName: \"kubernetes.io/projected/db880e58-57d3-4c3f-9d13-e6349a2f0659-kube-api-access-bchpz\") pod \"route-controller-manager-9dccc7df6-6nx2s\" (UID: \"db880e58-57d3-4c3f-9d13-e6349a2f0659\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.093409 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f5dad7b-574e-484d-a76a-56446e2e4e92-config\") pod \"controller-manager-6ffdfbfd5b-wjd6h\" (UID: \"3f5dad7b-574e-484d-a76a-56446e2e4e92\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.093427 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmjfw\" (UniqueName: \"kubernetes.io/projected/3f5dad7b-574e-484d-a76a-56446e2e4e92-kube-api-access-nmjfw\") pod \"controller-manager-6ffdfbfd5b-wjd6h\" (UID: \"3f5dad7b-574e-484d-a76a-56446e2e4e92\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.093471 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db880e58-57d3-4c3f-9d13-e6349a2f0659-client-ca\") pod \"route-controller-manager-9dccc7df6-6nx2s\" (UID: \"db880e58-57d3-4c3f-9d13-e6349a2f0659\") " 
pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.093485 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f5dad7b-574e-484d-a76a-56446e2e4e92-serving-cert\") pod \"controller-manager-6ffdfbfd5b-wjd6h\" (UID: \"3f5dad7b-574e-484d-a76a-56446e2e4e92\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.093504 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f5dad7b-574e-484d-a76a-56446e2e4e92-proxy-ca-bundles\") pod \"controller-manager-6ffdfbfd5b-wjd6h\" (UID: \"3f5dad7b-574e-484d-a76a-56446e2e4e92\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.093542 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db880e58-57d3-4c3f-9d13-e6349a2f0659-config\") pod \"route-controller-manager-9dccc7df6-6nx2s\" (UID: \"db880e58-57d3-4c3f-9d13-e6349a2f0659\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.094430 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f5dad7b-574e-484d-a76a-56446e2e4e92-client-ca\") pod \"controller-manager-6ffdfbfd5b-wjd6h\" (UID: \"3f5dad7b-574e-484d-a76a-56446e2e4e92\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.095283 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db880e58-57d3-4c3f-9d13-e6349a2f0659-config\") pod 
\"route-controller-manager-9dccc7df6-6nx2s\" (UID: \"db880e58-57d3-4c3f-9d13-e6349a2f0659\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.096090 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db880e58-57d3-4c3f-9d13-e6349a2f0659-client-ca\") pod \"route-controller-manager-9dccc7df6-6nx2s\" (UID: \"db880e58-57d3-4c3f-9d13-e6349a2f0659\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.097303 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f5dad7b-574e-484d-a76a-56446e2e4e92-proxy-ca-bundles\") pod \"controller-manager-6ffdfbfd5b-wjd6h\" (UID: \"3f5dad7b-574e-484d-a76a-56446e2e4e92\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.098821 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f5dad7b-574e-484d-a76a-56446e2e4e92-config\") pod \"controller-manager-6ffdfbfd5b-wjd6h\" (UID: \"3f5dad7b-574e-484d-a76a-56446e2e4e92\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.101519 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db880e58-57d3-4c3f-9d13-e6349a2f0659-serving-cert\") pod \"route-controller-manager-9dccc7df6-6nx2s\" (UID: \"db880e58-57d3-4c3f-9d13-e6349a2f0659\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.101643 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/3f5dad7b-574e-484d-a76a-56446e2e4e92-serving-cert\") pod \"controller-manager-6ffdfbfd5b-wjd6h\" (UID: \"3f5dad7b-574e-484d-a76a-56446e2e4e92\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.109597 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmjfw\" (UniqueName: \"kubernetes.io/projected/3f5dad7b-574e-484d-a76a-56446e2e4e92-kube-api-access-nmjfw\") pod \"controller-manager-6ffdfbfd5b-wjd6h\" (UID: \"3f5dad7b-574e-484d-a76a-56446e2e4e92\") " pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.112510 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bchpz\" (UniqueName: \"kubernetes.io/projected/db880e58-57d3-4c3f-9d13-e6349a2f0659-kube-api-access-bchpz\") pod \"route-controller-manager-9dccc7df6-6nx2s\" (UID: \"db880e58-57d3-4c3f-9d13-e6349a2f0659\") " pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.269416 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.288784 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.677659 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32b651e8-7776-49fd-8dfd-840583cf6b67" path="/var/lib/kubelet/pods/32b651e8-7776-49fd-8dfd-840583cf6b67/volumes" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.679230 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="785a671e-d76b-4644-add0-cc678616732a" path="/var/lib/kubelet/pods/785a671e-d76b-4644-add0-cc678616732a/volumes" Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.692516 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s"] Mar 20 12:32:14 crc kubenswrapper[4817]: I0320 12:32:14.769383 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h"] Mar 20 12:32:15 crc kubenswrapper[4817]: I0320 12:32:15.158619 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" event={"ID":"db880e58-57d3-4c3f-9d13-e6349a2f0659","Type":"ContainerStarted","Data":"09d9fed6358a50bcc8a0cccffce274d8ec52dbb4e51842d114b99bc9705e0d48"} Mar 20 12:32:15 crc kubenswrapper[4817]: I0320 12:32:15.159044 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" event={"ID":"db880e58-57d3-4c3f-9d13-e6349a2f0659","Type":"ContainerStarted","Data":"163a22a789339991991c540c5b73fba4cace4bea7173249f819ab19bdbc66816"} Mar 20 12:32:15 crc kubenswrapper[4817]: I0320 12:32:15.160572 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" Mar 20 12:32:15 crc kubenswrapper[4817]: I0320 12:32:15.172993 4817 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" event={"ID":"3f5dad7b-574e-484d-a76a-56446e2e4e92","Type":"ContainerStarted","Data":"f45ba3b4d87b589c79a7ab255618943af5fb994a8680f662fc7592778a469b1b"} Mar 20 12:32:15 crc kubenswrapper[4817]: I0320 12:32:15.173136 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" event={"ID":"3f5dad7b-574e-484d-a76a-56446e2e4e92","Type":"ContainerStarted","Data":"08cb56422528d28811211b4434b743d83a895988d1f24b750d3ad964388513a8"} Mar 20 12:32:15 crc kubenswrapper[4817]: I0320 12:32:15.173172 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:15 crc kubenswrapper[4817]: I0320 12:32:15.187839 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" podStartSLOduration=3.187809085 podStartE2EDuration="3.187809085s" podCreationTimestamp="2026-03-20 12:32:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:32:15.18127733 +0000 UTC m=+297.269590123" watchObservedRunningTime="2026-03-20 12:32:15.187809085 +0000 UTC m=+297.276121878" Mar 20 12:32:15 crc kubenswrapper[4817]: I0320 12:32:15.191836 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" Mar 20 12:32:15 crc kubenswrapper[4817]: I0320 12:32:15.211384 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6ffdfbfd5b-wjd6h" podStartSLOduration=3.211355562 podStartE2EDuration="3.211355562s" podCreationTimestamp="2026-03-20 12:32:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:32:15.207883127 +0000 UTC m=+297.296195910" watchObservedRunningTime="2026-03-20 12:32:15.211355562 +0000 UTC m=+297.299668385" Mar 20 12:32:15 crc kubenswrapper[4817]: I0320 12:32:15.421087 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9dccc7df6-6nx2s" Mar 20 12:32:19 crc kubenswrapper[4817]: I0320 12:32:19.961191 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-n7sgd" Mar 20 12:32:20 crc kubenswrapper[4817]: I0320 12:32:20.008412 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tcdxf"] Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.047551 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" podUID="bae6b6df-ce5b-473f-b03d-07b9d4380961" containerName="registry" containerID="cri-o://8453e26be0ed7fff720b5e9e12356fde75fb42563dd80f24adb0c2a99f53747b" gracePeriod=30 Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.361478 4817 generic.go:334] "Generic (PLEG): container finished" podID="bae6b6df-ce5b-473f-b03d-07b9d4380961" containerID="8453e26be0ed7fff720b5e9e12356fde75fb42563dd80f24adb0c2a99f53747b" exitCode=0 Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.361540 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" event={"ID":"bae6b6df-ce5b-473f-b03d-07b9d4380961","Type":"ContainerDied","Data":"8453e26be0ed7fff720b5e9e12356fde75fb42563dd80f24adb0c2a99f53747b"} Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.569163 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.663518 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bae6b6df-ce5b-473f-b03d-07b9d4380961-registry-certificates\") pod \"bae6b6df-ce5b-473f-b03d-07b9d4380961\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.663573 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bae6b6df-ce5b-473f-b03d-07b9d4380961-trusted-ca\") pod \"bae6b6df-ce5b-473f-b03d-07b9d4380961\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.663600 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bst9l\" (UniqueName: \"kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-kube-api-access-bst9l\") pod \"bae6b6df-ce5b-473f-b03d-07b9d4380961\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.663645 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-bound-sa-token\") pod \"bae6b6df-ce5b-473f-b03d-07b9d4380961\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.663664 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bae6b6df-ce5b-473f-b03d-07b9d4380961-installation-pull-secrets\") pod \"bae6b6df-ce5b-473f-b03d-07b9d4380961\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.663841 4817 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"bae6b6df-ce5b-473f-b03d-07b9d4380961\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.663896 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-registry-tls\") pod \"bae6b6df-ce5b-473f-b03d-07b9d4380961\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.663929 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bae6b6df-ce5b-473f-b03d-07b9d4380961-ca-trust-extracted\") pod \"bae6b6df-ce5b-473f-b03d-07b9d4380961\" (UID: \"bae6b6df-ce5b-473f-b03d-07b9d4380961\") " Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.665414 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bae6b6df-ce5b-473f-b03d-07b9d4380961-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bae6b6df-ce5b-473f-b03d-07b9d4380961" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.665588 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bae6b6df-ce5b-473f-b03d-07b9d4380961-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "bae6b6df-ce5b-473f-b03d-07b9d4380961" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.671782 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae6b6df-ce5b-473f-b03d-07b9d4380961-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "bae6b6df-ce5b-473f-b03d-07b9d4380961" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.671948 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-kube-api-access-bst9l" (OuterVolumeSpecName: "kube-api-access-bst9l") pod "bae6b6df-ce5b-473f-b03d-07b9d4380961" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961"). InnerVolumeSpecName "kube-api-access-bst9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.672199 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bae6b6df-ce5b-473f-b03d-07b9d4380961" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.673469 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "bae6b6df-ce5b-473f-b03d-07b9d4380961" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.679910 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "bae6b6df-ce5b-473f-b03d-07b9d4380961" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.681803 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bae6b6df-ce5b-473f-b03d-07b9d4380961-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "bae6b6df-ce5b-473f-b03d-07b9d4380961" (UID: "bae6b6df-ce5b-473f-b03d-07b9d4380961"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.765803 4817 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bae6b6df-ce5b-473f-b03d-07b9d4380961-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.766165 4817 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bae6b6df-ce5b-473f-b03d-07b9d4380961-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.766282 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bae6b6df-ce5b-473f-b03d-07b9d4380961-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.766394 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bst9l\" (UniqueName: 
\"kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-kube-api-access-bst9l\") on node \"crc\" DevicePath \"\"" Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.766473 4817 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.766590 4817 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bae6b6df-ce5b-473f-b03d-07b9d4380961-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 12:32:45 crc kubenswrapper[4817]: I0320 12:32:45.766686 4817 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bae6b6df-ce5b-473f-b03d-07b9d4380961-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 12:32:46 crc kubenswrapper[4817]: I0320 12:32:46.370019 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" event={"ID":"bae6b6df-ce5b-473f-b03d-07b9d4380961","Type":"ContainerDied","Data":"63b7d687eb9b8a7adb8422e2d6ef4d469b609ef0a9199d1a45f5f9631465fbcd"} Mar 20 12:32:46 crc kubenswrapper[4817]: I0320 12:32:46.370408 4817 scope.go:117] "RemoveContainer" containerID="8453e26be0ed7fff720b5e9e12356fde75fb42563dd80f24adb0c2a99f53747b" Mar 20 12:32:46 crc kubenswrapper[4817]: I0320 12:32:46.370178 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tcdxf" Mar 20 12:32:46 crc kubenswrapper[4817]: I0320 12:32:46.422781 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tcdxf"] Mar 20 12:32:46 crc kubenswrapper[4817]: I0320 12:32:46.429962 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tcdxf"] Mar 20 12:32:46 crc kubenswrapper[4817]: I0320 12:32:46.670724 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bae6b6df-ce5b-473f-b03d-07b9d4380961" path="/var/lib/kubelet/pods/bae6b6df-ce5b-473f-b03d-07b9d4380961/volumes" Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.771161 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sg4sx"] Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.774265 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sg4sx" podUID="98014d2a-ca27-4147-a4cf-081ce9325a83" containerName="registry-server" containerID="cri-o://6cbbc4f11ddc84d546f7bf97b32e3b3a4a89881e4c62cedee401dd8b238a9bf2" gracePeriod=30 Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.790772 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-52mbr"] Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.791055 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-52mbr" podUID="bd0c8df8-231e-4c91-8cfe-0182cd07e3d7" containerName="registry-server" containerID="cri-o://5980199afc2469b92fe453d8c2512860922e1b8b816e0392cbd8e9ebfc8c8870" gracePeriod=30 Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.802725 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bqnjh"] Mar 20 12:33:03 crc 
kubenswrapper[4817]: I0320 12:33:03.803025 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" podUID="4d926e43-e19a-460f-8e87-1fe72e62d352" containerName="marketplace-operator" containerID="cri-o://d7fbc95e04c0ebb57881cd89ac7e44da142ea149ef0bff4b240efbfce2e0b62d" gracePeriod=30 Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.814230 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cgvz"] Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.814448 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8cgvz" podUID="21934de0-bbff-4fcf-ad45-b3a6a2461030" containerName="registry-server" containerID="cri-o://748c734a98db313973b466dd100361a9067d454ef30967160927628cd637d66c" gracePeriod=30 Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.818885 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7zb6t"] Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.819294 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7zb6t" podUID="790f1757-c8f1-4a1b-93aa-c476aed2e981" containerName="registry-server" containerID="cri-o://bab84a0e1242105f894986d944df5eee95184f1940278823d4192f7776ad0746" gracePeriod=30 Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.825974 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tcp45"] Mar 20 12:33:03 crc kubenswrapper[4817]: E0320 12:33:03.831259 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae6b6df-ce5b-473f-b03d-07b9d4380961" containerName="registry" Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.831290 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae6b6df-ce5b-473f-b03d-07b9d4380961" 
containerName="registry" Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.831452 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="bae6b6df-ce5b-473f-b03d-07b9d4380961" containerName="registry" Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.831886 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tcp45" Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.835736 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tcp45"] Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.974494 4817 generic.go:334] "Generic (PLEG): container finished" podID="21934de0-bbff-4fcf-ad45-b3a6a2461030" containerID="748c734a98db313973b466dd100361a9067d454ef30967160927628cd637d66c" exitCode=0 Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.974597 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cgvz" event={"ID":"21934de0-bbff-4fcf-ad45-b3a6a2461030","Type":"ContainerDied","Data":"748c734a98db313973b466dd100361a9067d454ef30967160927628cd637d66c"} Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.984727 4817 generic.go:334] "Generic (PLEG): container finished" podID="790f1757-c8f1-4a1b-93aa-c476aed2e981" containerID="bab84a0e1242105f894986d944df5eee95184f1940278823d4192f7776ad0746" exitCode=0 Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.984818 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zb6t" event={"ID":"790f1757-c8f1-4a1b-93aa-c476aed2e981","Type":"ContainerDied","Data":"bab84a0e1242105f894986d944df5eee95184f1940278823d4192f7776ad0746"} Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.987266 4817 generic.go:334] "Generic (PLEG): container finished" podID="bd0c8df8-231e-4c91-8cfe-0182cd07e3d7" 
containerID="5980199afc2469b92fe453d8c2512860922e1b8b816e0392cbd8e9ebfc8c8870" exitCode=0 Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.987313 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52mbr" event={"ID":"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7","Type":"ContainerDied","Data":"5980199afc2469b92fe453d8c2512860922e1b8b816e0392cbd8e9ebfc8c8870"} Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.988614 4817 generic.go:334] "Generic (PLEG): container finished" podID="4d926e43-e19a-460f-8e87-1fe72e62d352" containerID="d7fbc95e04c0ebb57881cd89ac7e44da142ea149ef0bff4b240efbfce2e0b62d" exitCode=0 Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.988685 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" event={"ID":"4d926e43-e19a-460f-8e87-1fe72e62d352","Type":"ContainerDied","Data":"d7fbc95e04c0ebb57881cd89ac7e44da142ea149ef0bff4b240efbfce2e0b62d"} Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.990234 4817 generic.go:334] "Generic (PLEG): container finished" podID="98014d2a-ca27-4147-a4cf-081ce9325a83" containerID="6cbbc4f11ddc84d546f7bf97b32e3b3a4a89881e4c62cedee401dd8b238a9bf2" exitCode=0 Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.990266 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sg4sx" event={"ID":"98014d2a-ca27-4147-a4cf-081ce9325a83","Type":"ContainerDied","Data":"6cbbc4f11ddc84d546f7bf97b32e3b3a4a89881e4c62cedee401dd8b238a9bf2"} Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.995207 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bllkm\" (UniqueName: \"kubernetes.io/projected/6a6ea42d-7322-4815-9996-def420f63525-kube-api-access-bllkm\") pod \"marketplace-operator-79b997595-tcp45\" (UID: \"6a6ea42d-7322-4815-9996-def420f63525\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-tcp45" Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.995254 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a6ea42d-7322-4815-9996-def420f63525-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tcp45\" (UID: \"6a6ea42d-7322-4815-9996-def420f63525\") " pod="openshift-marketplace/marketplace-operator-79b997595-tcp45" Mar 20 12:33:03 crc kubenswrapper[4817]: I0320 12:33:03.995295 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6a6ea42d-7322-4815-9996-def420f63525-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tcp45\" (UID: \"6a6ea42d-7322-4815-9996-def420f63525\") " pod="openshift-marketplace/marketplace-operator-79b997595-tcp45" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.096243 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6a6ea42d-7322-4815-9996-def420f63525-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tcp45\" (UID: \"6a6ea42d-7322-4815-9996-def420f63525\") " pod="openshift-marketplace/marketplace-operator-79b997595-tcp45" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.096366 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bllkm\" (UniqueName: \"kubernetes.io/projected/6a6ea42d-7322-4815-9996-def420f63525-kube-api-access-bllkm\") pod \"marketplace-operator-79b997595-tcp45\" (UID: \"6a6ea42d-7322-4815-9996-def420f63525\") " pod="openshift-marketplace/marketplace-operator-79b997595-tcp45" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.096393 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a6ea42d-7322-4815-9996-def420f63525-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tcp45\" (UID: \"6a6ea42d-7322-4815-9996-def420f63525\") " pod="openshift-marketplace/marketplace-operator-79b997595-tcp45" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.098415 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6a6ea42d-7322-4815-9996-def420f63525-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tcp45\" (UID: \"6a6ea42d-7322-4815-9996-def420f63525\") " pod="openshift-marketplace/marketplace-operator-79b997595-tcp45" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.103966 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6a6ea42d-7322-4815-9996-def420f63525-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tcp45\" (UID: \"6a6ea42d-7322-4815-9996-def420f63525\") " pod="openshift-marketplace/marketplace-operator-79b997595-tcp45" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.114573 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bllkm\" (UniqueName: \"kubernetes.io/projected/6a6ea42d-7322-4815-9996-def420f63525-kube-api-access-bllkm\") pod \"marketplace-operator-79b997595-tcp45\" (UID: \"6a6ea42d-7322-4815-9996-def420f63525\") " pod="openshift-marketplace/marketplace-operator-79b997595-tcp45" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.174553 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tcp45" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.269379 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sg4sx" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.302028 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd68d\" (UniqueName: \"kubernetes.io/projected/98014d2a-ca27-4147-a4cf-081ce9325a83-kube-api-access-wd68d\") pod \"98014d2a-ca27-4147-a4cf-081ce9325a83\" (UID: \"98014d2a-ca27-4147-a4cf-081ce9325a83\") " Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.302068 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98014d2a-ca27-4147-a4cf-081ce9325a83-catalog-content\") pod \"98014d2a-ca27-4147-a4cf-081ce9325a83\" (UID: \"98014d2a-ca27-4147-a4cf-081ce9325a83\") " Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.302096 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98014d2a-ca27-4147-a4cf-081ce9325a83-utilities\") pod \"98014d2a-ca27-4147-a4cf-081ce9325a83\" (UID: \"98014d2a-ca27-4147-a4cf-081ce9325a83\") " Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.305371 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98014d2a-ca27-4147-a4cf-081ce9325a83-utilities" (OuterVolumeSpecName: "utilities") pod "98014d2a-ca27-4147-a4cf-081ce9325a83" (UID: "98014d2a-ca27-4147-a4cf-081ce9325a83"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.308250 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98014d2a-ca27-4147-a4cf-081ce9325a83-kube-api-access-wd68d" (OuterVolumeSpecName: "kube-api-access-wd68d") pod "98014d2a-ca27-4147-a4cf-081ce9325a83" (UID: "98014d2a-ca27-4147-a4cf-081ce9325a83"). InnerVolumeSpecName "kube-api-access-wd68d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.329978 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-52mbr" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.359206 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8cgvz" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.361030 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.366364 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7zb6t" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.401176 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98014d2a-ca27-4147-a4cf-081ce9325a83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98014d2a-ca27-4147-a4cf-081ce9325a83" (UID: "98014d2a-ca27-4147-a4cf-081ce9325a83"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.402642 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgmld\" (UniqueName: \"kubernetes.io/projected/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-kube-api-access-fgmld\") pod \"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7\" (UID: \"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7\") " Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.402700 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt7fk\" (UniqueName: \"kubernetes.io/projected/790f1757-c8f1-4a1b-93aa-c476aed2e981-kube-api-access-wt7fk\") pod \"790f1757-c8f1-4a1b-93aa-c476aed2e981\" (UID: \"790f1757-c8f1-4a1b-93aa-c476aed2e981\") " Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.402740 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-utilities\") pod \"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7\" (UID: \"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7\") " Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.402760 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790f1757-c8f1-4a1b-93aa-c476aed2e981-utilities\") pod \"790f1757-c8f1-4a1b-93aa-c476aed2e981\" (UID: \"790f1757-c8f1-4a1b-93aa-c476aed2e981\") " Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.402797 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4d926e43-e19a-460f-8e87-1fe72e62d352-marketplace-operator-metrics\") pod \"4d926e43-e19a-460f-8e87-1fe72e62d352\" (UID: \"4d926e43-e19a-460f-8e87-1fe72e62d352\") " Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.402824 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-bc678\" (UniqueName: \"kubernetes.io/projected/4d926e43-e19a-460f-8e87-1fe72e62d352-kube-api-access-bc678\") pod \"4d926e43-e19a-460f-8e87-1fe72e62d352\" (UID: \"4d926e43-e19a-460f-8e87-1fe72e62d352\") " Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.402844 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-catalog-content\") pod \"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7\" (UID: \"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7\") " Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.402876 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790f1757-c8f1-4a1b-93aa-c476aed2e981-catalog-content\") pod \"790f1757-c8f1-4a1b-93aa-c476aed2e981\" (UID: \"790f1757-c8f1-4a1b-93aa-c476aed2e981\") " Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.402963 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d926e43-e19a-460f-8e87-1fe72e62d352-marketplace-trusted-ca\") pod \"4d926e43-e19a-460f-8e87-1fe72e62d352\" (UID: \"4d926e43-e19a-460f-8e87-1fe72e62d352\") " Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.403466 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh4n2\" (UniqueName: \"kubernetes.io/projected/21934de0-bbff-4fcf-ad45-b3a6a2461030-kube-api-access-zh4n2\") pod \"21934de0-bbff-4fcf-ad45-b3a6a2461030\" (UID: \"21934de0-bbff-4fcf-ad45-b3a6a2461030\") " Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.403663 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21934de0-bbff-4fcf-ad45-b3a6a2461030-catalog-content\") pod \"21934de0-bbff-4fcf-ad45-b3a6a2461030\" (UID: 
\"21934de0-bbff-4fcf-ad45-b3a6a2461030\") " Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.403701 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21934de0-bbff-4fcf-ad45-b3a6a2461030-utilities\") pod \"21934de0-bbff-4fcf-ad45-b3a6a2461030\" (UID: \"21934de0-bbff-4fcf-ad45-b3a6a2461030\") " Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.403817 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-utilities" (OuterVolumeSpecName: "utilities") pod "bd0c8df8-231e-4c91-8cfe-0182cd07e3d7" (UID: "bd0c8df8-231e-4c91-8cfe-0182cd07e3d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.403885 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd68d\" (UniqueName: \"kubernetes.io/projected/98014d2a-ca27-4147-a4cf-081ce9325a83-kube-api-access-wd68d\") on node \"crc\" DevicePath \"\"" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.403903 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98014d2a-ca27-4147-a4cf-081ce9325a83-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.403917 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98014d2a-ca27-4147-a4cf-081ce9325a83-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.403884 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/790f1757-c8f1-4a1b-93aa-c476aed2e981-utilities" (OuterVolumeSpecName: "utilities") pod "790f1757-c8f1-4a1b-93aa-c476aed2e981" (UID: "790f1757-c8f1-4a1b-93aa-c476aed2e981"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.405242 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d926e43-e19a-460f-8e87-1fe72e62d352-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "4d926e43-e19a-460f-8e87-1fe72e62d352" (UID: "4d926e43-e19a-460f-8e87-1fe72e62d352"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.405776 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21934de0-bbff-4fcf-ad45-b3a6a2461030-utilities" (OuterVolumeSpecName: "utilities") pod "21934de0-bbff-4fcf-ad45-b3a6a2461030" (UID: "21934de0-bbff-4fcf-ad45-b3a6a2461030"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.408143 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21934de0-bbff-4fcf-ad45-b3a6a2461030-kube-api-access-zh4n2" (OuterVolumeSpecName: "kube-api-access-zh4n2") pod "21934de0-bbff-4fcf-ad45-b3a6a2461030" (UID: "21934de0-bbff-4fcf-ad45-b3a6a2461030"). InnerVolumeSpecName "kube-api-access-zh4n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.409872 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-kube-api-access-fgmld" (OuterVolumeSpecName: "kube-api-access-fgmld") pod "bd0c8df8-231e-4c91-8cfe-0182cd07e3d7" (UID: "bd0c8df8-231e-4c91-8cfe-0182cd07e3d7"). InnerVolumeSpecName "kube-api-access-fgmld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.409982 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/790f1757-c8f1-4a1b-93aa-c476aed2e981-kube-api-access-wt7fk" (OuterVolumeSpecName: "kube-api-access-wt7fk") pod "790f1757-c8f1-4a1b-93aa-c476aed2e981" (UID: "790f1757-c8f1-4a1b-93aa-c476aed2e981"). InnerVolumeSpecName "kube-api-access-wt7fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.410794 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d926e43-e19a-460f-8e87-1fe72e62d352-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "4d926e43-e19a-460f-8e87-1fe72e62d352" (UID: "4d926e43-e19a-460f-8e87-1fe72e62d352"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 12:33:04 crc kubenswrapper[4817]: I0320 12:33:04.411489 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d926e43-e19a-460f-8e87-1fe72e62d352-kube-api-access-bc678" (OuterVolumeSpecName: "kube-api-access-bc678") pod "4d926e43-e19a-460f-8e87-1fe72e62d352" (UID: "4d926e43-e19a-460f-8e87-1fe72e62d352"). InnerVolumeSpecName "kube-api-access-bc678". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.446042 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21934de0-bbff-4fcf-ad45-b3a6a2461030-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21934de0-bbff-4fcf-ad45-b3a6a2461030" (UID: "21934de0-bbff-4fcf-ad45-b3a6a2461030"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.469635 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tcp45"] Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.471010 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd0c8df8-231e-4c91-8cfe-0182cd07e3d7" (UID: "bd0c8df8-231e-4c91-8cfe-0182cd07e3d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.505089 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.505133 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790f1757-c8f1-4a1b-93aa-c476aed2e981-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.505146 4817 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4d926e43-e19a-460f-8e87-1fe72e62d352-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.505157 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.505166 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc678\" (UniqueName: 
\"kubernetes.io/projected/4d926e43-e19a-460f-8e87-1fe72e62d352-kube-api-access-bc678\") on node \"crc\" DevicePath \"\"" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.505176 4817 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d926e43-e19a-460f-8e87-1fe72e62d352-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.505187 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh4n2\" (UniqueName: \"kubernetes.io/projected/21934de0-bbff-4fcf-ad45-b3a6a2461030-kube-api-access-zh4n2\") on node \"crc\" DevicePath \"\"" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.505197 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21934de0-bbff-4fcf-ad45-b3a6a2461030-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.505207 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21934de0-bbff-4fcf-ad45-b3a6a2461030-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.505217 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgmld\" (UniqueName: \"kubernetes.io/projected/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7-kube-api-access-fgmld\") on node \"crc\" DevicePath \"\"" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.505240 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt7fk\" (UniqueName: \"kubernetes.io/projected/790f1757-c8f1-4a1b-93aa-c476aed2e981-kube-api-access-wt7fk\") on node \"crc\" DevicePath \"\"" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.587287 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/790f1757-c8f1-4a1b-93aa-c476aed2e981-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "790f1757-c8f1-4a1b-93aa-c476aed2e981" (UID: "790f1757-c8f1-4a1b-93aa-c476aed2e981"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.606228 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790f1757-c8f1-4a1b-93aa-c476aed2e981-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.996388 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.996388 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bqnjh" event={"ID":"4d926e43-e19a-460f-8e87-1fe72e62d352","Type":"ContainerDied","Data":"d3034b987c63b19823c9011c2fc43896cf70eb054d274b0efd4c577f3316220c"} Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.996432 4817 scope.go:117] "RemoveContainer" containerID="d7fbc95e04c0ebb57881cd89ac7e44da142ea149ef0bff4b240efbfce2e0b62d" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.999857 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sg4sx" event={"ID":"98014d2a-ca27-4147-a4cf-081ce9325a83","Type":"ContainerDied","Data":"3002d49f2c49e22e53d595b3cd36a8c0c7db2089a4bc0017d36c3b5b80df5325"} Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:04.999965 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sg4sx" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.003606 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8cgvz" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.003620 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cgvz" event={"ID":"21934de0-bbff-4fcf-ad45-b3a6a2461030","Type":"ContainerDied","Data":"b03e99425704d8d2a366d6629c056c159a7bc4aaea203519dbd53ff607bab8d8"} Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.006596 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7zb6t" event={"ID":"790f1757-c8f1-4a1b-93aa-c476aed2e981","Type":"ContainerDied","Data":"f33406ff95cf8aef48d22004d4cd507657d7b3bffa280beb3bb091be741dbe19"} Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.006608 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7zb6t" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.008914 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tcp45" event={"ID":"6a6ea42d-7322-4815-9996-def420f63525","Type":"ContainerStarted","Data":"280c7e82b34452b1072d4db951e0fc9f2ae70ea326051ae675c1d4575fd8f348"} Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.008954 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tcp45" event={"ID":"6a6ea42d-7322-4815-9996-def420f63525","Type":"ContainerStarted","Data":"4ed107f8e66205b3231cbefd4d14cfae817027e352d227480dbbf76ef11cdb9f"} Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.009157 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tcp45" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.012680 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52mbr" 
event={"ID":"bd0c8df8-231e-4c91-8cfe-0182cd07e3d7","Type":"ContainerDied","Data":"2e71738029eb6ff1dc463a53a52d058c153f0648f37bf4d8a91bdc49c61093c0"} Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.012763 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-52mbr" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.013724 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tcp45" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.029272 4817 scope.go:117] "RemoveContainer" containerID="6cbbc4f11ddc84d546f7bf97b32e3b3a4a89881e4c62cedee401dd8b238a9bf2" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.043844 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cgvz"] Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.060540 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tcp45" podStartSLOduration=2.060517914 podStartE2EDuration="2.060517914s" podCreationTimestamp="2026-03-20 12:33:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 12:33:05.036469608 +0000 UTC m=+347.124782411" watchObservedRunningTime="2026-03-20 12:33:05.060517914 +0000 UTC m=+347.148830697" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.061835 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cgvz"] Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.075709 4817 scope.go:117] "RemoveContainer" containerID="9700e09fcd2f7975e31e92f09f9a347557d38d9caa6d51d222e5629fa3d34ced" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.076595 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-sg4sx"] Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.081869 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sg4sx"] Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.085736 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7zb6t"] Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.090014 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7zb6t"] Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.097021 4817 scope.go:117] "RemoveContainer" containerID="440749064c710205ebd5a7f89aae0c561114b9be1c40b20337232833b6b1cac9" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.097146 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bqnjh"] Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.102964 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bqnjh"] Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.124756 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-52mbr"] Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.127421 4817 scope.go:117] "RemoveContainer" containerID="748c734a98db313973b466dd100361a9067d454ef30967160927628cd637d66c" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.131834 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-52mbr"] Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.141601 4817 scope.go:117] "RemoveContainer" containerID="33bd90d283bfacaa66e5d5c5ec21c5b6bff7eb5df7023e0b6cdc987c159411e5" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.154773 4817 scope.go:117] "RemoveContainer" containerID="eaa046d5703f712a98eb9301405f61bccdd8829d572d806398c85723f0a5e146" 
Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.177341 4817 scope.go:117] "RemoveContainer" containerID="bab84a0e1242105f894986d944df5eee95184f1940278823d4192f7776ad0746" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.194282 4817 scope.go:117] "RemoveContainer" containerID="15c5e7233dfcf3f9bb645402d51c258820e0c0d294e0a80990824124f0833118" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.209095 4817 scope.go:117] "RemoveContainer" containerID="75069fe013ef2233d4ea44594c5a419a2343a81a9b627f0cab51ff6bea4e1f26" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.224431 4817 scope.go:117] "RemoveContainer" containerID="5980199afc2469b92fe453d8c2512860922e1b8b816e0392cbd8e9ebfc8c8870" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.239973 4817 scope.go:117] "RemoveContainer" containerID="0a675bf20f679ec802c7d556ea2142718794f40f1403c1dd1b9fc2165b250629" Mar 20 12:33:05 crc kubenswrapper[4817]: I0320 12:33:05.259902 4817 scope.go:117] "RemoveContainer" containerID="c4e6ff2eccbddf87b22af1daac19c2acad517e0394af07082a881dd04748e654" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.670428 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21934de0-bbff-4fcf-ad45-b3a6a2461030" path="/var/lib/kubelet/pods/21934de0-bbff-4fcf-ad45-b3a6a2461030/volumes" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.671658 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d926e43-e19a-460f-8e87-1fe72e62d352" path="/var/lib/kubelet/pods/4d926e43-e19a-460f-8e87-1fe72e62d352/volumes" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.672378 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="790f1757-c8f1-4a1b-93aa-c476aed2e981" path="/var/lib/kubelet/pods/790f1757-c8f1-4a1b-93aa-c476aed2e981/volumes" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.673821 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98014d2a-ca27-4147-a4cf-081ce9325a83" 
path="/var/lib/kubelet/pods/98014d2a-ca27-4147-a4cf-081ce9325a83/volumes" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.674641 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd0c8df8-231e-4c91-8cfe-0182cd07e3d7" path="/var/lib/kubelet/pods/bd0c8df8-231e-4c91-8cfe-0182cd07e3d7/volumes" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.792629 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x2ffz"] Mar 20 12:33:06 crc kubenswrapper[4817]: E0320 12:33:06.792897 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0c8df8-231e-4c91-8cfe-0182cd07e3d7" containerName="extract-content" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.792921 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0c8df8-231e-4c91-8cfe-0182cd07e3d7" containerName="extract-content" Mar 20 12:33:06 crc kubenswrapper[4817]: E0320 12:33:06.792936 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21934de0-bbff-4fcf-ad45-b3a6a2461030" containerName="extract-content" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.792945 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="21934de0-bbff-4fcf-ad45-b3a6a2461030" containerName="extract-content" Mar 20 12:33:06 crc kubenswrapper[4817]: E0320 12:33:06.792955 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21934de0-bbff-4fcf-ad45-b3a6a2461030" containerName="extract-utilities" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.792963 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="21934de0-bbff-4fcf-ad45-b3a6a2461030" containerName="extract-utilities" Mar 20 12:33:06 crc kubenswrapper[4817]: E0320 12:33:06.792976 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790f1757-c8f1-4a1b-93aa-c476aed2e981" containerName="extract-content" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.792983 4817 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="790f1757-c8f1-4a1b-93aa-c476aed2e981" containerName="extract-content" Mar 20 12:33:06 crc kubenswrapper[4817]: E0320 12:33:06.792993 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790f1757-c8f1-4a1b-93aa-c476aed2e981" containerName="registry-server" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.793002 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="790f1757-c8f1-4a1b-93aa-c476aed2e981" containerName="registry-server" Mar 20 12:33:06 crc kubenswrapper[4817]: E0320 12:33:06.793014 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0c8df8-231e-4c91-8cfe-0182cd07e3d7" containerName="extract-utilities" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.793023 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0c8df8-231e-4c91-8cfe-0182cd07e3d7" containerName="extract-utilities" Mar 20 12:33:06 crc kubenswrapper[4817]: E0320 12:33:06.793039 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98014d2a-ca27-4147-a4cf-081ce9325a83" containerName="extract-content" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.793046 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="98014d2a-ca27-4147-a4cf-081ce9325a83" containerName="extract-content" Mar 20 12:33:06 crc kubenswrapper[4817]: E0320 12:33:06.793055 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d926e43-e19a-460f-8e87-1fe72e62d352" containerName="marketplace-operator" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.793063 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d926e43-e19a-460f-8e87-1fe72e62d352" containerName="marketplace-operator" Mar 20 12:33:06 crc kubenswrapper[4817]: E0320 12:33:06.793078 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790f1757-c8f1-4a1b-93aa-c476aed2e981" containerName="extract-utilities" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.793086 4817 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="790f1757-c8f1-4a1b-93aa-c476aed2e981" containerName="extract-utilities" Mar 20 12:33:06 crc kubenswrapper[4817]: E0320 12:33:06.793096 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98014d2a-ca27-4147-a4cf-081ce9325a83" containerName="registry-server" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.793103 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="98014d2a-ca27-4147-a4cf-081ce9325a83" containerName="registry-server" Mar 20 12:33:06 crc kubenswrapper[4817]: E0320 12:33:06.793111 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0c8df8-231e-4c91-8cfe-0182cd07e3d7" containerName="registry-server" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.793135 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0c8df8-231e-4c91-8cfe-0182cd07e3d7" containerName="registry-server" Mar 20 12:33:06 crc kubenswrapper[4817]: E0320 12:33:06.793146 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21934de0-bbff-4fcf-ad45-b3a6a2461030" containerName="registry-server" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.793153 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="21934de0-bbff-4fcf-ad45-b3a6a2461030" containerName="registry-server" Mar 20 12:33:06 crc kubenswrapper[4817]: E0320 12:33:06.793162 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98014d2a-ca27-4147-a4cf-081ce9325a83" containerName="extract-utilities" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.793169 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="98014d2a-ca27-4147-a4cf-081ce9325a83" containerName="extract-utilities" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.793275 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d926e43-e19a-460f-8e87-1fe72e62d352" containerName="marketplace-operator" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.793289 4817 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="21934de0-bbff-4fcf-ad45-b3a6a2461030" containerName="registry-server" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.793300 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd0c8df8-231e-4c91-8cfe-0182cd07e3d7" containerName="registry-server" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.793307 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="790f1757-c8f1-4a1b-93aa-c476aed2e981" containerName="registry-server" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.793314 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="98014d2a-ca27-4147-a4cf-081ce9325a83" containerName="registry-server" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.794413 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2ffz" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.797525 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.804642 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2ffz"] Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.848732 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4-catalog-content\") pod \"redhat-marketplace-x2ffz\" (UID: \"ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4\") " pod="openshift-marketplace/redhat-marketplace-x2ffz" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.848831 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85cpk\" (UniqueName: \"kubernetes.io/projected/ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4-kube-api-access-85cpk\") pod 
\"redhat-marketplace-x2ffz\" (UID: \"ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4\") " pod="openshift-marketplace/redhat-marketplace-x2ffz" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.848860 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4-utilities\") pod \"redhat-marketplace-x2ffz\" (UID: \"ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4\") " pod="openshift-marketplace/redhat-marketplace-x2ffz" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.950634 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85cpk\" (UniqueName: \"kubernetes.io/projected/ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4-kube-api-access-85cpk\") pod \"redhat-marketplace-x2ffz\" (UID: \"ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4\") " pod="openshift-marketplace/redhat-marketplace-x2ffz" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.950709 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4-utilities\") pod \"redhat-marketplace-x2ffz\" (UID: \"ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4\") " pod="openshift-marketplace/redhat-marketplace-x2ffz" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.950779 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4-catalog-content\") pod \"redhat-marketplace-x2ffz\" (UID: \"ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4\") " pod="openshift-marketplace/redhat-marketplace-x2ffz" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.951760 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4-utilities\") pod \"redhat-marketplace-x2ffz\" (UID: 
\"ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4\") " pod="openshift-marketplace/redhat-marketplace-x2ffz" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.952173 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4-catalog-content\") pod \"redhat-marketplace-x2ffz\" (UID: \"ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4\") " pod="openshift-marketplace/redhat-marketplace-x2ffz" Mar 20 12:33:06 crc kubenswrapper[4817]: I0320 12:33:06.973373 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85cpk\" (UniqueName: \"kubernetes.io/projected/ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4-kube-api-access-85cpk\") pod \"redhat-marketplace-x2ffz\" (UID: \"ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4\") " pod="openshift-marketplace/redhat-marketplace-x2ffz" Mar 20 12:33:07 crc kubenswrapper[4817]: I0320 12:33:07.116250 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2ffz" Mar 20 12:33:07 crc kubenswrapper[4817]: I0320 12:33:07.386743 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n9tlt"] Mar 20 12:33:07 crc kubenswrapper[4817]: I0320 12:33:07.389316 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n9tlt" Mar 20 12:33:07 crc kubenswrapper[4817]: I0320 12:33:07.391358 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 12:33:07 crc kubenswrapper[4817]: I0320 12:33:07.393872 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n9tlt"] Mar 20 12:33:07 crc kubenswrapper[4817]: I0320 12:33:07.573066 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zntq2\" (UniqueName: \"kubernetes.io/projected/515ff5a8-2336-4d6c-8156-fcf0a1b5ed14-kube-api-access-zntq2\") pod \"redhat-operators-n9tlt\" (UID: \"515ff5a8-2336-4d6c-8156-fcf0a1b5ed14\") " pod="openshift-marketplace/redhat-operators-n9tlt" Mar 20 12:33:07 crc kubenswrapper[4817]: I0320 12:33:07.573140 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/515ff5a8-2336-4d6c-8156-fcf0a1b5ed14-utilities\") pod \"redhat-operators-n9tlt\" (UID: \"515ff5a8-2336-4d6c-8156-fcf0a1b5ed14\") " pod="openshift-marketplace/redhat-operators-n9tlt" Mar 20 12:33:07 crc kubenswrapper[4817]: I0320 12:33:07.573241 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/515ff5a8-2336-4d6c-8156-fcf0a1b5ed14-catalog-content\") pod \"redhat-operators-n9tlt\" (UID: \"515ff5a8-2336-4d6c-8156-fcf0a1b5ed14\") " pod="openshift-marketplace/redhat-operators-n9tlt" Mar 20 12:33:07 crc kubenswrapper[4817]: I0320 12:33:07.578382 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2ffz"] Mar 20 12:33:07 crc kubenswrapper[4817]: I0320 12:33:07.674498 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zntq2\" (UniqueName: \"kubernetes.io/projected/515ff5a8-2336-4d6c-8156-fcf0a1b5ed14-kube-api-access-zntq2\") pod \"redhat-operators-n9tlt\" (UID: \"515ff5a8-2336-4d6c-8156-fcf0a1b5ed14\") " pod="openshift-marketplace/redhat-operators-n9tlt" Mar 20 12:33:07 crc kubenswrapper[4817]: I0320 12:33:07.674565 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/515ff5a8-2336-4d6c-8156-fcf0a1b5ed14-utilities\") pod \"redhat-operators-n9tlt\" (UID: \"515ff5a8-2336-4d6c-8156-fcf0a1b5ed14\") " pod="openshift-marketplace/redhat-operators-n9tlt" Mar 20 12:33:07 crc kubenswrapper[4817]: I0320 12:33:07.674604 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/515ff5a8-2336-4d6c-8156-fcf0a1b5ed14-catalog-content\") pod \"redhat-operators-n9tlt\" (UID: \"515ff5a8-2336-4d6c-8156-fcf0a1b5ed14\") " pod="openshift-marketplace/redhat-operators-n9tlt" Mar 20 12:33:07 crc kubenswrapper[4817]: I0320 12:33:07.675003 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/515ff5a8-2336-4d6c-8156-fcf0a1b5ed14-catalog-content\") pod \"redhat-operators-n9tlt\" (UID: \"515ff5a8-2336-4d6c-8156-fcf0a1b5ed14\") " pod="openshift-marketplace/redhat-operators-n9tlt" Mar 20 12:33:07 crc kubenswrapper[4817]: I0320 12:33:07.675198 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/515ff5a8-2336-4d6c-8156-fcf0a1b5ed14-utilities\") pod \"redhat-operators-n9tlt\" (UID: \"515ff5a8-2336-4d6c-8156-fcf0a1b5ed14\") " pod="openshift-marketplace/redhat-operators-n9tlt" Mar 20 12:33:07 crc kubenswrapper[4817]: I0320 12:33:07.690620 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zntq2\" (UniqueName: 
\"kubernetes.io/projected/515ff5a8-2336-4d6c-8156-fcf0a1b5ed14-kube-api-access-zntq2\") pod \"redhat-operators-n9tlt\" (UID: \"515ff5a8-2336-4d6c-8156-fcf0a1b5ed14\") " pod="openshift-marketplace/redhat-operators-n9tlt" Mar 20 12:33:07 crc kubenswrapper[4817]: I0320 12:33:07.707367 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9tlt" Mar 20 12:33:08 crc kubenswrapper[4817]: I0320 12:33:08.035799 4817 generic.go:334] "Generic (PLEG): container finished" podID="ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4" containerID="a3effb4ed774e1076c3fee835e6a61b962156d21b8f7384d78a46d0747db5228" exitCode=0 Mar 20 12:33:08 crc kubenswrapper[4817]: I0320 12:33:08.035861 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2ffz" event={"ID":"ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4","Type":"ContainerDied","Data":"a3effb4ed774e1076c3fee835e6a61b962156d21b8f7384d78a46d0747db5228"} Mar 20 12:33:08 crc kubenswrapper[4817]: I0320 12:33:08.035939 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2ffz" event={"ID":"ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4","Type":"ContainerStarted","Data":"14299b82d492ed67b8b339178ce2c76cf36be850ef91c4eb259e09358e31763c"} Mar 20 12:33:08 crc kubenswrapper[4817]: I0320 12:33:08.088543 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n9tlt"] Mar 20 12:33:08 crc kubenswrapper[4817]: W0320 12:33:08.094177 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod515ff5a8_2336_4d6c_8156_fcf0a1b5ed14.slice/crio-04edafb044deb4c3acf1568c19369d066442f829916122846d7a1de55742ce02 WatchSource:0}: Error finding container 04edafb044deb4c3acf1568c19369d066442f829916122846d7a1de55742ce02: Status 404 returned error can't find the container with id 
04edafb044deb4c3acf1568c19369d066442f829916122846d7a1de55742ce02 Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.051093 4817 generic.go:334] "Generic (PLEG): container finished" podID="515ff5a8-2336-4d6c-8156-fcf0a1b5ed14" containerID="38d60638005f342127d7a58ae89afb8a5c6ca86e74f8f89feada8e3aca734807" exitCode=0 Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.051172 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9tlt" event={"ID":"515ff5a8-2336-4d6c-8156-fcf0a1b5ed14","Type":"ContainerDied","Data":"38d60638005f342127d7a58ae89afb8a5c6ca86e74f8f89feada8e3aca734807"} Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.051576 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9tlt" event={"ID":"515ff5a8-2336-4d6c-8156-fcf0a1b5ed14","Type":"ContainerStarted","Data":"04edafb044deb4c3acf1568c19369d066442f829916122846d7a1de55742ce02"} Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.310074 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rh8jk"] Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.311279 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rh8jk" Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.315496 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.338230 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rh8jk"] Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.399213 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqssz\" (UniqueName: \"kubernetes.io/projected/96e109e0-981e-4c7f-a057-722a41195da9-kube-api-access-jqssz\") pod \"community-operators-rh8jk\" (UID: \"96e109e0-981e-4c7f-a057-722a41195da9\") " pod="openshift-marketplace/community-operators-rh8jk" Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.399264 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e109e0-981e-4c7f-a057-722a41195da9-utilities\") pod \"community-operators-rh8jk\" (UID: \"96e109e0-981e-4c7f-a057-722a41195da9\") " pod="openshift-marketplace/community-operators-rh8jk" Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.399283 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e109e0-981e-4c7f-a057-722a41195da9-catalog-content\") pod \"community-operators-rh8jk\" (UID: \"96e109e0-981e-4c7f-a057-722a41195da9\") " pod="openshift-marketplace/community-operators-rh8jk" Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.499937 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqssz\" (UniqueName: \"kubernetes.io/projected/96e109e0-981e-4c7f-a057-722a41195da9-kube-api-access-jqssz\") pod \"community-operators-rh8jk\" 
(UID: \"96e109e0-981e-4c7f-a057-722a41195da9\") " pod="openshift-marketplace/community-operators-rh8jk" Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.499996 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e109e0-981e-4c7f-a057-722a41195da9-utilities\") pod \"community-operators-rh8jk\" (UID: \"96e109e0-981e-4c7f-a057-722a41195da9\") " pod="openshift-marketplace/community-operators-rh8jk" Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.500015 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e109e0-981e-4c7f-a057-722a41195da9-catalog-content\") pod \"community-operators-rh8jk\" (UID: \"96e109e0-981e-4c7f-a057-722a41195da9\") " pod="openshift-marketplace/community-operators-rh8jk" Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.500460 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e109e0-981e-4c7f-a057-722a41195da9-utilities\") pod \"community-operators-rh8jk\" (UID: \"96e109e0-981e-4c7f-a057-722a41195da9\") " pod="openshift-marketplace/community-operators-rh8jk" Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.500515 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e109e0-981e-4c7f-a057-722a41195da9-catalog-content\") pod \"community-operators-rh8jk\" (UID: \"96e109e0-981e-4c7f-a057-722a41195da9\") " pod="openshift-marketplace/community-operators-rh8jk" Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.527518 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqssz\" (UniqueName: \"kubernetes.io/projected/96e109e0-981e-4c7f-a057-722a41195da9-kube-api-access-jqssz\") pod \"community-operators-rh8jk\" (UID: \"96e109e0-981e-4c7f-a057-722a41195da9\") " 
pod="openshift-marketplace/community-operators-rh8jk" Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.631225 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rh8jk" Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.797689 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-62q6v"] Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.799850 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-62q6v" Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.802863 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.807498 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-62q6v"] Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.904989 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d2868be-8fe3-4308-b80d-c81e3817f32c-utilities\") pod \"certified-operators-62q6v\" (UID: \"7d2868be-8fe3-4308-b80d-c81e3817f32c\") " pod="openshift-marketplace/certified-operators-62q6v" Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.905036 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d2868be-8fe3-4308-b80d-c81e3817f32c-catalog-content\") pod \"certified-operators-62q6v\" (UID: \"7d2868be-8fe3-4308-b80d-c81e3817f32c\") " pod="openshift-marketplace/certified-operators-62q6v" Mar 20 12:33:09 crc kubenswrapper[4817]: I0320 12:33:09.905100 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzc8w\" (UniqueName: 
\"kubernetes.io/projected/7d2868be-8fe3-4308-b80d-c81e3817f32c-kube-api-access-wzc8w\") pod \"certified-operators-62q6v\" (UID: \"7d2868be-8fe3-4308-b80d-c81e3817f32c\") " pod="openshift-marketplace/certified-operators-62q6v" Mar 20 12:33:10 crc kubenswrapper[4817]: I0320 12:33:10.005990 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzc8w\" (UniqueName: \"kubernetes.io/projected/7d2868be-8fe3-4308-b80d-c81e3817f32c-kube-api-access-wzc8w\") pod \"certified-operators-62q6v\" (UID: \"7d2868be-8fe3-4308-b80d-c81e3817f32c\") " pod="openshift-marketplace/certified-operators-62q6v" Mar 20 12:33:10 crc kubenswrapper[4817]: I0320 12:33:10.006054 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d2868be-8fe3-4308-b80d-c81e3817f32c-utilities\") pod \"certified-operators-62q6v\" (UID: \"7d2868be-8fe3-4308-b80d-c81e3817f32c\") " pod="openshift-marketplace/certified-operators-62q6v" Mar 20 12:33:10 crc kubenswrapper[4817]: I0320 12:33:10.006089 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d2868be-8fe3-4308-b80d-c81e3817f32c-catalog-content\") pod \"certified-operators-62q6v\" (UID: \"7d2868be-8fe3-4308-b80d-c81e3817f32c\") " pod="openshift-marketplace/certified-operators-62q6v" Mar 20 12:33:10 crc kubenswrapper[4817]: I0320 12:33:10.006631 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d2868be-8fe3-4308-b80d-c81e3817f32c-utilities\") pod \"certified-operators-62q6v\" (UID: \"7d2868be-8fe3-4308-b80d-c81e3817f32c\") " pod="openshift-marketplace/certified-operators-62q6v" Mar 20 12:33:10 crc kubenswrapper[4817]: I0320 12:33:10.006680 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7d2868be-8fe3-4308-b80d-c81e3817f32c-catalog-content\") pod \"certified-operators-62q6v\" (UID: \"7d2868be-8fe3-4308-b80d-c81e3817f32c\") " pod="openshift-marketplace/certified-operators-62q6v" Mar 20 12:33:10 crc kubenswrapper[4817]: I0320 12:33:10.024508 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzc8w\" (UniqueName: \"kubernetes.io/projected/7d2868be-8fe3-4308-b80d-c81e3817f32c-kube-api-access-wzc8w\") pod \"certified-operators-62q6v\" (UID: \"7d2868be-8fe3-4308-b80d-c81e3817f32c\") " pod="openshift-marketplace/certified-operators-62q6v" Mar 20 12:33:10 crc kubenswrapper[4817]: I0320 12:33:10.062083 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rh8jk"] Mar 20 12:33:10 crc kubenswrapper[4817]: I0320 12:33:10.070377 4817 generic.go:334] "Generic (PLEG): container finished" podID="ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4" containerID="42f6a88b34d4f22dfa03095e2595bb7304d630eaa743836febb6b3978f81d3d1" exitCode=0 Mar 20 12:33:10 crc kubenswrapper[4817]: I0320 12:33:10.070435 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2ffz" event={"ID":"ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4","Type":"ContainerDied","Data":"42f6a88b34d4f22dfa03095e2595bb7304d630eaa743836febb6b3978f81d3d1"} Mar 20 12:33:10 crc kubenswrapper[4817]: W0320 12:33:10.073155 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96e109e0_981e_4c7f_a057_722a41195da9.slice/crio-b83f3a6a470a9b016632977968f66b069ebea957f46897a854381394c51290da WatchSource:0}: Error finding container b83f3a6a470a9b016632977968f66b069ebea957f46897a854381394c51290da: Status 404 returned error can't find the container with id b83f3a6a470a9b016632977968f66b069ebea957f46897a854381394c51290da Mar 20 12:33:10 crc kubenswrapper[4817]: I0320 12:33:10.124688 4817 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-62q6v" Mar 20 12:33:10 crc kubenswrapper[4817]: I0320 12:33:10.566243 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-62q6v"] Mar 20 12:33:10 crc kubenswrapper[4817]: W0320 12:33:10.578388 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d2868be_8fe3_4308_b80d_c81e3817f32c.slice/crio-f51871d22d2582ae121ef7d8bf07ed10b7395320be9f7c1d25a3a973a9bda5c2 WatchSource:0}: Error finding container f51871d22d2582ae121ef7d8bf07ed10b7395320be9f7c1d25a3a973a9bda5c2: Status 404 returned error can't find the container with id f51871d22d2582ae121ef7d8bf07ed10b7395320be9f7c1d25a3a973a9bda5c2 Mar 20 12:33:11 crc kubenswrapper[4817]: I0320 12:33:11.075989 4817 generic.go:334] "Generic (PLEG): container finished" podID="96e109e0-981e-4c7f-a057-722a41195da9" containerID="e4d64b4c62877124339b6880432a86c3bffc42dac741eb46907d0bfc3eb2930f" exitCode=0 Mar 20 12:33:11 crc kubenswrapper[4817]: I0320 12:33:11.076096 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh8jk" event={"ID":"96e109e0-981e-4c7f-a057-722a41195da9","Type":"ContainerDied","Data":"e4d64b4c62877124339b6880432a86c3bffc42dac741eb46907d0bfc3eb2930f"} Mar 20 12:33:11 crc kubenswrapper[4817]: I0320 12:33:11.076159 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh8jk" event={"ID":"96e109e0-981e-4c7f-a057-722a41195da9","Type":"ContainerStarted","Data":"b83f3a6a470a9b016632977968f66b069ebea957f46897a854381394c51290da"} Mar 20 12:33:11 crc kubenswrapper[4817]: I0320 12:33:11.078288 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2ffz" 
event={"ID":"ee2a9175-cd7a-4dc2-ba0a-40b2403a3dd4","Type":"ContainerStarted","Data":"38f2667616ab1cecf82fe7ded35b1c41c3a60d3e3469e975429217e35431e34a"} Mar 20 12:33:11 crc kubenswrapper[4817]: I0320 12:33:11.079433 4817 generic.go:334] "Generic (PLEG): container finished" podID="7d2868be-8fe3-4308-b80d-c81e3817f32c" containerID="47963a0b12e381984a269f6a751b1bf735234ce11fad8b968139dd43a4b00d87" exitCode=0 Mar 20 12:33:11 crc kubenswrapper[4817]: I0320 12:33:11.079469 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-62q6v" event={"ID":"7d2868be-8fe3-4308-b80d-c81e3817f32c","Type":"ContainerDied","Data":"47963a0b12e381984a269f6a751b1bf735234ce11fad8b968139dd43a4b00d87"} Mar 20 12:33:11 crc kubenswrapper[4817]: I0320 12:33:11.079487 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-62q6v" event={"ID":"7d2868be-8fe3-4308-b80d-c81e3817f32c","Type":"ContainerStarted","Data":"f51871d22d2582ae121ef7d8bf07ed10b7395320be9f7c1d25a3a973a9bda5c2"} Mar 20 12:33:11 crc kubenswrapper[4817]: I0320 12:33:11.133791 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x2ffz" podStartSLOduration=2.676306733 podStartE2EDuration="5.133775042s" podCreationTimestamp="2026-03-20 12:33:06 +0000 UTC" firstStartedPulling="2026-03-20 12:33:08.037842648 +0000 UTC m=+350.126155471" lastFinishedPulling="2026-03-20 12:33:10.495310967 +0000 UTC m=+352.583623780" observedRunningTime="2026-03-20 12:33:11.131800461 +0000 UTC m=+353.220113244" watchObservedRunningTime="2026-03-20 12:33:11.133775042 +0000 UTC m=+353.222087825" Mar 20 12:33:12 crc kubenswrapper[4817]: I0320 12:33:12.090394 4817 generic.go:334] "Generic (PLEG): container finished" podID="515ff5a8-2336-4d6c-8156-fcf0a1b5ed14" containerID="5522f9888afba8226d481a612e67074cca4185ed4465a7b9a94de7cd370cae5d" exitCode=0 Mar 20 12:33:12 crc kubenswrapper[4817]: I0320 
12:33:12.090595 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9tlt" event={"ID":"515ff5a8-2336-4d6c-8156-fcf0a1b5ed14","Type":"ContainerDied","Data":"5522f9888afba8226d481a612e67074cca4185ed4465a7b9a94de7cd370cae5d"} Mar 20 12:33:12 crc kubenswrapper[4817]: I0320 12:33:12.101683 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh8jk" event={"ID":"96e109e0-981e-4c7f-a057-722a41195da9","Type":"ContainerStarted","Data":"b116b8da814cc731ecfd0d38da047b3dcd86ca520fd723d0144707c188560c37"} Mar 20 12:33:13 crc kubenswrapper[4817]: I0320 12:33:13.108273 4817 generic.go:334] "Generic (PLEG): container finished" podID="7d2868be-8fe3-4308-b80d-c81e3817f32c" containerID="8fe2c62705ee4c95caeba0046659517ab6c0c9bb5b5a312aa7f6694fc7b9c43d" exitCode=0 Mar 20 12:33:13 crc kubenswrapper[4817]: I0320 12:33:13.108356 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-62q6v" event={"ID":"7d2868be-8fe3-4308-b80d-c81e3817f32c","Type":"ContainerDied","Data":"8fe2c62705ee4c95caeba0046659517ab6c0c9bb5b5a312aa7f6694fc7b9c43d"} Mar 20 12:33:13 crc kubenswrapper[4817]: I0320 12:33:13.134350 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9tlt" event={"ID":"515ff5a8-2336-4d6c-8156-fcf0a1b5ed14","Type":"ContainerStarted","Data":"0621c3ffc6772b8c910b5509d0004401372b767386907482b0584d420e795460"} Mar 20 12:33:13 crc kubenswrapper[4817]: I0320 12:33:13.137947 4817 generic.go:334] "Generic (PLEG): container finished" podID="96e109e0-981e-4c7f-a057-722a41195da9" containerID="b116b8da814cc731ecfd0d38da047b3dcd86ca520fd723d0144707c188560c37" exitCode=0 Mar 20 12:33:13 crc kubenswrapper[4817]: I0320 12:33:13.138066 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh8jk" 
event={"ID":"96e109e0-981e-4c7f-a057-722a41195da9","Type":"ContainerDied","Data":"b116b8da814cc731ecfd0d38da047b3dcd86ca520fd723d0144707c188560c37"} Mar 20 12:33:13 crc kubenswrapper[4817]: I0320 12:33:13.187448 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n9tlt" podStartSLOduration=2.652810626 podStartE2EDuration="6.187428533s" podCreationTimestamp="2026-03-20 12:33:07 +0000 UTC" firstStartedPulling="2026-03-20 12:33:09.065649829 +0000 UTC m=+351.153962612" lastFinishedPulling="2026-03-20 12:33:12.600267736 +0000 UTC m=+354.688580519" observedRunningTime="2026-03-20 12:33:13.186251817 +0000 UTC m=+355.274564610" watchObservedRunningTime="2026-03-20 12:33:13.187428533 +0000 UTC m=+355.275741326" Mar 20 12:33:14 crc kubenswrapper[4817]: I0320 12:33:14.144840 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-62q6v" event={"ID":"7d2868be-8fe3-4308-b80d-c81e3817f32c","Type":"ContainerStarted","Data":"e132ea00c753f5e030043debccea47cdc3ed3f61a4aed2842a5dde780c5e2b7c"} Mar 20 12:33:14 crc kubenswrapper[4817]: I0320 12:33:14.147411 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rh8jk" event={"ID":"96e109e0-981e-4c7f-a057-722a41195da9","Type":"ContainerStarted","Data":"060f4a4690dc3adb7ea5a55a637c1a9f8e9602a45e6b6883fe9d3a620cce52ce"} Mar 20 12:33:14 crc kubenswrapper[4817]: I0320 12:33:14.167401 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-62q6v" podStartSLOduration=2.690777556 podStartE2EDuration="5.167378019s" podCreationTimestamp="2026-03-20 12:33:09 +0000 UTC" firstStartedPulling="2026-03-20 12:33:11.080400199 +0000 UTC m=+353.168712982" lastFinishedPulling="2026-03-20 12:33:13.557000652 +0000 UTC m=+355.645313445" observedRunningTime="2026-03-20 12:33:14.16446963 +0000 UTC m=+356.252782403" 
watchObservedRunningTime="2026-03-20 12:33:14.167378019 +0000 UTC m=+356.255690802" Mar 20 12:33:17 crc kubenswrapper[4817]: I0320 12:33:17.117271 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x2ffz" Mar 20 12:33:17 crc kubenswrapper[4817]: I0320 12:33:17.117908 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x2ffz" Mar 20 12:33:17 crc kubenswrapper[4817]: I0320 12:33:17.175941 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x2ffz" Mar 20 12:33:17 crc kubenswrapper[4817]: I0320 12:33:17.199065 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rh8jk" podStartSLOduration=5.685238314 podStartE2EDuration="8.199040606s" podCreationTimestamp="2026-03-20 12:33:09 +0000 UTC" firstStartedPulling="2026-03-20 12:33:11.077804759 +0000 UTC m=+353.166117542" lastFinishedPulling="2026-03-20 12:33:13.591607051 +0000 UTC m=+355.679919834" observedRunningTime="2026-03-20 12:33:14.19942619 +0000 UTC m=+356.287738973" watchObservedRunningTime="2026-03-20 12:33:17.199040606 +0000 UTC m=+359.287353409" Mar 20 12:33:17 crc kubenswrapper[4817]: I0320 12:33:17.238053 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x2ffz" Mar 20 12:33:17 crc kubenswrapper[4817]: I0320 12:33:17.708589 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n9tlt" Mar 20 12:33:17 crc kubenswrapper[4817]: I0320 12:33:17.708999 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n9tlt" Mar 20 12:33:18 crc kubenswrapper[4817]: I0320 12:33:18.761868 4817 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-n9tlt" podUID="515ff5a8-2336-4d6c-8156-fcf0a1b5ed14" containerName="registry-server" probeResult="failure" output=< Mar 20 12:33:18 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s Mar 20 12:33:18 crc kubenswrapper[4817]: > Mar 20 12:33:19 crc kubenswrapper[4817]: I0320 12:33:19.631642 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rh8jk" Mar 20 12:33:19 crc kubenswrapper[4817]: I0320 12:33:19.631726 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rh8jk" Mar 20 12:33:19 crc kubenswrapper[4817]: I0320 12:33:19.696147 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rh8jk" Mar 20 12:33:20 crc kubenswrapper[4817]: I0320 12:33:20.125367 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-62q6v" Mar 20 12:33:20 crc kubenswrapper[4817]: I0320 12:33:20.125783 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-62q6v" Mar 20 12:33:20 crc kubenswrapper[4817]: I0320 12:33:20.190619 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-62q6v" Mar 20 12:33:20 crc kubenswrapper[4817]: I0320 12:33:20.234599 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rh8jk" Mar 20 12:33:20 crc kubenswrapper[4817]: I0320 12:33:20.243981 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-62q6v" Mar 20 12:33:27 crc kubenswrapper[4817]: I0320 12:33:27.762280 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-n9tlt"
Mar 20 12:33:27 crc kubenswrapper[4817]: I0320 12:33:27.814208 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n9tlt"
Mar 20 12:33:55 crc kubenswrapper[4817]: I0320 12:33:55.574208 4817 patch_prober.go:28] interesting pod/machine-config-daemon-dch6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 12:33:55 crc kubenswrapper[4817]: I0320 12:33:55.574835 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dch6v" podUID="c8b7e138-8c64-47fb-84b7-4a42e612947d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 12:34:00 crc kubenswrapper[4817]: I0320 12:34:00.750342 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566834-pzvwc"]
Mar 20 12:34:00 crc kubenswrapper[4817]: I0320 12:34:00.752294 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566834-pzvwc"
Mar 20 12:34:00 crc kubenswrapper[4817]: I0320 12:34:00.755389 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566834-pzvwc"]
Mar 20 12:34:00 crc kubenswrapper[4817]: I0320 12:34:00.755698 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 12:34:00 crc kubenswrapper[4817]: I0320 12:34:00.755917 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lqzqd"
Mar 20 12:34:00 crc kubenswrapper[4817]: I0320 12:34:00.756019 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 12:34:00 crc kubenswrapper[4817]: I0320 12:34:00.878691 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lgq6\" (UniqueName: \"kubernetes.io/projected/485a15e2-4849-4f55-a6b3-45b55ad2c723-kube-api-access-9lgq6\") pod \"auto-csr-approver-29566834-pzvwc\" (UID: \"485a15e2-4849-4f55-a6b3-45b55ad2c723\") " pod="openshift-infra/auto-csr-approver-29566834-pzvwc"
Mar 20 12:34:00 crc kubenswrapper[4817]: I0320 12:34:00.981037 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lgq6\" (UniqueName: \"kubernetes.io/projected/485a15e2-4849-4f55-a6b3-45b55ad2c723-kube-api-access-9lgq6\") pod \"auto-csr-approver-29566834-pzvwc\" (UID: \"485a15e2-4849-4f55-a6b3-45b55ad2c723\") " pod="openshift-infra/auto-csr-approver-29566834-pzvwc"
Mar 20 12:34:01 crc kubenswrapper[4817]: I0320 12:34:01.008522 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lgq6\" (UniqueName: \"kubernetes.io/projected/485a15e2-4849-4f55-a6b3-45b55ad2c723-kube-api-access-9lgq6\") pod \"auto-csr-approver-29566834-pzvwc\" (UID: \"485a15e2-4849-4f55-a6b3-45b55ad2c723\") " pod="openshift-infra/auto-csr-approver-29566834-pzvwc"
Mar 20 12:34:01 crc kubenswrapper[4817]: I0320 12:34:01.106935 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566834-pzvwc"
Mar 20 12:34:01 crc kubenswrapper[4817]: I0320 12:34:01.512016 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566834-pzvwc"]
Mar 20 12:34:01 crc kubenswrapper[4817]: I0320 12:34:01.733582 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566834-pzvwc" event={"ID":"485a15e2-4849-4f55-a6b3-45b55ad2c723","Type":"ContainerStarted","Data":"ab5965e5dfc7855b2aafa8c07c9da474a93cf460c0de816acea5527e7e854334"}
Mar 20 12:34:03 crc kubenswrapper[4817]: I0320 12:34:03.750698 4817 generic.go:334] "Generic (PLEG): container finished" podID="485a15e2-4849-4f55-a6b3-45b55ad2c723" containerID="5d35677d646bd6d431b83cada8acc804dc7f737d8a07f7e9478890efcab4c536" exitCode=0
Mar 20 12:34:03 crc kubenswrapper[4817]: I0320 12:34:03.750814 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566834-pzvwc" event={"ID":"485a15e2-4849-4f55-a6b3-45b55ad2c723","Type":"ContainerDied","Data":"5d35677d646bd6d431b83cada8acc804dc7f737d8a07f7e9478890efcab4c536"}
Mar 20 12:34:04 crc kubenswrapper[4817]: I0320 12:34:04.996785 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566834-pzvwc"
Mar 20 12:34:05 crc kubenswrapper[4817]: I0320 12:34:05.044468 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lgq6\" (UniqueName: \"kubernetes.io/projected/485a15e2-4849-4f55-a6b3-45b55ad2c723-kube-api-access-9lgq6\") pod \"485a15e2-4849-4f55-a6b3-45b55ad2c723\" (UID: \"485a15e2-4849-4f55-a6b3-45b55ad2c723\") "
Mar 20 12:34:05 crc kubenswrapper[4817]: I0320 12:34:05.051940 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485a15e2-4849-4f55-a6b3-45b55ad2c723-kube-api-access-9lgq6" (OuterVolumeSpecName: "kube-api-access-9lgq6") pod "485a15e2-4849-4f55-a6b3-45b55ad2c723" (UID: "485a15e2-4849-4f55-a6b3-45b55ad2c723"). InnerVolumeSpecName "kube-api-access-9lgq6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 12:34:05 crc kubenswrapper[4817]: I0320 12:34:05.146477 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lgq6\" (UniqueName: \"kubernetes.io/projected/485a15e2-4849-4f55-a6b3-45b55ad2c723-kube-api-access-9lgq6\") on node \"crc\" DevicePath \"\""
Mar 20 12:34:05 crc kubenswrapper[4817]: I0320 12:34:05.766354 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566834-pzvwc" event={"ID":"485a15e2-4849-4f55-a6b3-45b55ad2c723","Type":"ContainerDied","Data":"ab5965e5dfc7855b2aafa8c07c9da474a93cf460c0de816acea5527e7e854334"}
Mar 20 12:34:05 crc kubenswrapper[4817]: I0320 12:34:05.766393 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab5965e5dfc7855b2aafa8c07c9da474a93cf460c0de816acea5527e7e854334"
Mar 20 12:34:05 crc kubenswrapper[4817]: I0320 12:34:05.766438 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566834-pzvwc"
Mar 20 12:34:25 crc kubenswrapper[4817]: I0320 12:34:25.575334 4817 patch_prober.go:28] interesting pod/machine-config-daemon-dch6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 12:34:25 crc kubenswrapper[4817]: I0320 12:34:25.576519 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dch6v" podUID="c8b7e138-8c64-47fb-84b7-4a42e612947d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 12:34:55 crc kubenswrapper[4817]: I0320 12:34:55.575452 4817 patch_prober.go:28] interesting pod/machine-config-daemon-dch6v container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 12:34:55 crc kubenswrapper[4817]: I0320 12:34:55.576187 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dch6v" podUID="c8b7e138-8c64-47fb-84b7-4a42e612947d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 12:34:55 crc kubenswrapper[4817]: I0320 12:34:55.576260 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dch6v"
Mar 20 12:34:55 crc kubenswrapper[4817]: I0320 12:34:55.577304 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf1ae3dbdd47367c661bf7c7e6a843dda21fb19d5950ad9e49f9dd2202c0159b"} pod="openshift-machine-config-operator/machine-config-daemon-dch6v" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 12:34:55 crc kubenswrapper[4817]: I0320 12:34:55.577402 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dch6v" podUID="c8b7e138-8c64-47fb-84b7-4a42e612947d" containerName="machine-config-daemon" containerID="cri-o://bf1ae3dbdd47367c661bf7c7e6a843dda21fb19d5950ad9e49f9dd2202c0159b" gracePeriod=600
Mar 20 12:34:56 crc kubenswrapper[4817]: I0320 12:34:56.114370 4817 generic.go:334] "Generic (PLEG): container finished" podID="c8b7e138-8c64-47fb-84b7-4a42e612947d" containerID="bf1ae3dbdd47367c661bf7c7e6a843dda21fb19d5950ad9e49f9dd2202c0159b" exitCode=0
Mar 20 12:34:56 crc kubenswrapper[4817]: I0320 12:34:56.114452 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dch6v" event={"ID":"c8b7e138-8c64-47fb-84b7-4a42e612947d","Type":"ContainerDied","Data":"bf1ae3dbdd47367c661bf7c7e6a843dda21fb19d5950ad9e49f9dd2202c0159b"}
Mar 20 12:34:56 crc kubenswrapper[4817]: I0320 12:34:56.115494 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dch6v" event={"ID":"c8b7e138-8c64-47fb-84b7-4a42e612947d","Type":"ContainerStarted","Data":"d53f9c30d9bf50bfd68a8c108296a5169de44af1ec7b12b862961114f0bc69ff"}
Mar 20 12:34:56 crc kubenswrapper[4817]: I0320 12:34:56.115565 4817 scope.go:117] "RemoveContainer" containerID="bf0956e93ab88c31a4db3e9b805bb42b61e80dc5e4176715cb441ce0c2ff0420"
Mar 20 12:35:19 crc kubenswrapper[4817]: I0320 12:35:19.253049 4817 scope.go:117] "RemoveContainer" containerID="2cc1ab1f8285d1fe593f437cf1b38be716ccde5bc67c75ba36f25a4a9b59d1cf"
Mar 20 12:36:00 crc kubenswrapper[4817]: I0320 12:36:00.158226 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566836-xxsxw"]
Mar 20 12:36:00 crc kubenswrapper[4817]: E0320 12:36:00.159240 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485a15e2-4849-4f55-a6b3-45b55ad2c723" containerName="oc"
Mar 20 12:36:00 crc kubenswrapper[4817]: I0320 12:36:00.159264 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="485a15e2-4849-4f55-a6b3-45b55ad2c723" containerName="oc"
Mar 20 12:36:00 crc kubenswrapper[4817]: I0320 12:36:00.159432 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="485a15e2-4849-4f55-a6b3-45b55ad2c723" containerName="oc"
Mar 20 12:36:00 crc kubenswrapper[4817]: I0320 12:36:00.160050 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566836-xxsxw"
Mar 20 12:36:00 crc kubenswrapper[4817]: I0320 12:36:00.166703 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566836-xxsxw"]
Mar 20 12:36:00 crc kubenswrapper[4817]: I0320 12:36:00.167032 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 12:36:00 crc kubenswrapper[4817]: I0320 12:36:00.167085 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-lqzqd"
Mar 20 12:36:00 crc kubenswrapper[4817]: I0320 12:36:00.167622 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 12:36:00 crc kubenswrapper[4817]: I0320 12:36:00.255594 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fz7z\" (UniqueName: \"kubernetes.io/projected/99252e54-1a99-42a3-8d5d-67b1ab1633db-kube-api-access-9fz7z\") pod \"auto-csr-approver-29566836-xxsxw\" (UID: \"99252e54-1a99-42a3-8d5d-67b1ab1633db\") " pod="openshift-infra/auto-csr-approver-29566836-xxsxw"
Mar 20 12:36:00 crc kubenswrapper[4817]: I0320 12:36:00.358815 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fz7z\" (UniqueName: \"kubernetes.io/projected/99252e54-1a99-42a3-8d5d-67b1ab1633db-kube-api-access-9fz7z\") pod \"auto-csr-approver-29566836-xxsxw\" (UID: \"99252e54-1a99-42a3-8d5d-67b1ab1633db\") " pod="openshift-infra/auto-csr-approver-29566836-xxsxw"
Mar 20 12:36:00 crc kubenswrapper[4817]: I0320 12:36:00.386112 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fz7z\" (UniqueName: \"kubernetes.io/projected/99252e54-1a99-42a3-8d5d-67b1ab1633db-kube-api-access-9fz7z\") pod \"auto-csr-approver-29566836-xxsxw\" (UID: \"99252e54-1a99-42a3-8d5d-67b1ab1633db\") " pod="openshift-infra/auto-csr-approver-29566836-xxsxw"
Mar 20 12:36:00 crc kubenswrapper[4817]: I0320 12:36:00.490302 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566836-xxsxw"
Mar 20 12:36:00 crc kubenswrapper[4817]: I0320 12:36:00.971319 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566836-xxsxw"]
Mar 20 12:36:00 crc kubenswrapper[4817]: I0320 12:36:00.993631 4817 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider